Tech for Good Toolkit

Commonwealth’s Emerging Tech for All (ETA) initiative aims to ensure that the design of new financial technologies integrates the needs, wants, and aspirations of people living on low to moderate incomes (LMI). We shed light on these perspectives by partnering with financial organizations, fintechs, and platform providers that implement emerging technologies to research the needs and wants of customers living on LMI.

This toolkit is intended to provide design guidance for financial institutions and other organizations that offer financial services to populations living on LMI using emerging technologies like conversational AI (chatbots) and automated financial management software. The toolkit is organized around three primary goals for the provision of effective and accessible financial services for populations living on LMI: building trust, driving engagement, and increasing value.

This toolkit will evolve as our research in this area progresses. Below, find insights into how we developed these goals based on existing barriers, as well as ideas for effective design.

Build Trust

Before users living on LMI are willing to engage with conversational AI, financial automation, or other emerging technologies, they need to feel that communications are secure and the information they receive is trustworthy. Building trust is especially important in light of financially vulnerable Americans’ history of exclusion from financial services and higher likelihood of experience with predatory alternative financial products.

Emphasize Data Security

Threats to personal data privacy are top of mind for many Americans, and the risks of identity theft and other security breaches are particularly significant in a financial context.¹ In Commonwealth’s research on financial chatbot usage, 42% of users reported that they were concerned about the security of information shared with chatbots. 

  • Include branding on chatbots or apps. Users are more likely to trust tools from companies they already have a relationship with and have interacted with positively in the past.
  • Include upfront messaging about data security. Clearly reporting what measures exist to secure and protect data can communicate that organizations are aware of and proactive about data security before chatbots or other software requests personal information.
  • Give users control over what data is stored. Because the success of emerging financial technologies like chatbots often depends on sharing some personal information, addressing these concerns is an important trust-building strategy.
Clarify the Source of Guidance

Top user concerns when considering asking a chatbot for guidance were that the chatbot would push products they didn’t need, that the information would be inaccurate, or that the guidance given by chatbots would not address their needs.²

  • Include clear messaging explaining that customers are interacting with a chatbot and not a human. Our research indicates that banking customers are often unsure if they are chatting with a human agent or a chatbot. This uncertainty can set chatbot interactions up for failure if a user believes they are interacting with a live agent.
  • When making product recommendations, consider connecting to real customer reviews or action steps taken by other customers. Helping clients connect recommendations to other real clients in similar financial situations may help build trust with clients who would otherwise feel like the chatbot is simply pushing a product.³
Provide a Human Connection

While use of and comfort with chatbots have increased significantly since the pandemic, users retain a strong preference for interacting with a real person, especially for more complicated issues. Our research indicates that when given a choice between a human and a chatbot, there is a strong preference for human assistance.

  • Make it easy to connect to a human agent. Making it easy to move from a chatbot conversation to a live agent can improve openness to financial chatbots by creating a pathway to solutions rather than a potential dead end. This potential for human connection may play an important role in applications of various customer-facing technologies. Knowing that a human is available as an alternative makes users more willing to give these novel pathways a shot.
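This handoff guidance can be sketched in code. The following is a minimal, illustrative escalation policy for a chatbot turn handler; the keyword list, threshold, and function names are assumptions for illustration, not part of any specific chatbot framework.

```python
# Illustrative sketch only: when to offer a live-agent handoff.
# All names and thresholds here are assumptions, not a real product's API.

AGENT_KEYWORDS = {"agent", "human", "representative", "person"}
MAX_FAILED_TURNS = 2  # consecutive turns in which the bot failed to help


def should_offer_agent(user_message: str, failed_turns: int) -> bool:
    """Offer a live-agent handoff when the user asks for one, or after
    the bot has failed to help for several turns in a row."""
    asked = any(word in user_message.lower() for word in AGENT_KEYWORDS)
    return asked or failed_turns >= MAX_FAILED_TURNS
```

The design choice here matches the guidance above: the escalation path is always reachable, both on explicit request and automatically, so the chatbot never becomes a dead end.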
Offer Low-Risk Opportunities to Engage with New Technologies

Many concerns about chatbots and AI tools are rooted in a lack of familiarity with these technologies. While user trust and comfort take time to develop, financial institutions and organizations may be able to accelerate this process by exposing users to new technologies in low-risk ways.

  • Give users examples of potential chatbot interactions. In one-on-one interviews with Commonwealth, users had reservations about AI and chatbots when speaking in abstract terms, but could identify specific use cases relevant to them when presented with examples. Offering users access to demos or examples to familiarize themselves with new technologies may help build a foundation of trust before situations that are more urgent or require sharing sensitive information.

Drive Engagement

Ensuring that users living on LMI get the most out of chatbots and other emerging financial technologies means working to create an engaging experience in which users feel like they understand what these tools can do for them and when to use them.

Create Clarity Around Functions

When asked about their priorities for chatbot features, users most often named an ability to understand a wide variety of questions, the option to connect to a live agent, and an ability for a chatbot to connect to accounts and real-time financial data. 

  • Put function first to rebuild trust in the reliability of chatbots. In Commonwealth’s research, participants consistently placed these three core capabilities among their top priorities, ahead of more novel features.
  • Give clarity on what the chatbot can and cannot accomplish. Users expressed the most frustration when they felt continuously redirected by the chatbot experience without arriving at the information they needed. Chatbot designers can avoid this by providing examples of topics the chatbot can help with, and designing clear flows that end redirection loops.
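The loop-breaking guidance above can be sketched as a simple fallback policy. This is an illustrative example, not a real product's logic; the topic list and message wording are assumptions.

```python
# Illustrative sketch: after repeated misses, break the redirection loop
# by listing concrete topics and offering a way out. Topic names are
# hypothetical examples.

SUPPORTED_TOPICS = ["check a balance", "dispute a charge", "reset a password"]


def fallback_reply(consecutive_fallbacks: int) -> str:
    """Return a fallback message; after two consecutive misses, stop
    redirecting and state capabilities plus an exit to a live agent."""
    if consecutive_fallbacks < 2:
        return "Sorry, I didn't catch that. Could you rephrase?"
    topics = ", ".join(SUPPORTED_TOPICS)
    return ("I may not be able to help with that. I can help you "
            f"{topics}, or connect you to a live agent.")
```

Tracking consecutive fallbacks, rather than total fallbacks, is what ends the loop: the bot stops asking the user to rephrase indefinitely and instead surfaces what it can actually do.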
Pursue Intelligent Proactivity

Without any proactive engagement, it is easy for customers to overlook the existence of chatbots that might be able to assist them effectively. At the same time, pushy chatbots that are too aggressive in making their presence known can feel “spammy” and turn users off. Intelligent proactivity is about designing consumer-facing technologies that can recognize the moments when users might benefit from assistance and make suggestions in a way that invites engagement. 

  • Have chatbots pop up on specific webpages where actions can be taken. When chatbots pop up before they are needed, users learn to disregard them, including later on when they could be useful.
  • Avoid chatbot messaging that immediately suggests a specific product or service. Instead, consider opening with information about the different ways a chatbot can help. Chatbots that are perceived as vehicles for advertising or other offers can turn users off.
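Both bullets above amount to a page-aware trigger: surface the chatbot only where an action can be taken, and open with capabilities rather than a pitch. A minimal sketch, with hypothetical page paths and prompt text:

```python
# Illustrative sketch of "intelligent proactivity". Page paths and prompt
# wording are assumptions for the example, not from any real product.
from typing import Optional

PAGE_PROMPTS = {
    "/payments": "Need help sending a payment or checking its status?",
    "/disputes": "I can walk you through disputing a transaction.",
}


def proactive_prompt(page_path: str) -> Optional[str]:
    # Surface the chatbot only where an action can be taken;
    # stay quiet everywhere else.
    return PAGE_PROMPTS.get(page_path)
```

Note that each prompt describes what the chatbot can help with rather than suggesting a specific product, consistent with the guidance above.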
Create Personalized Experiences

When it comes to making financial decisions, users are looking for advice that not only draws on industry best practices, but also on their personal values and goals. When asked what types of personal information users would be willing to share with a chatbot, users were much more willing to share about their personal financial goals or values compared to their financial data. Offering ways for users to share information can increase engagement and build trust.

  • Connect with customers’ values. When given the opportunity to set goals within a financial tool, participating users set an average of five goals, suggesting that users are willing to provide more detailed information when it can lead to greater personalization. Connecting with users on a personal level and incorporating their individual values and preferences into financial recommendations not only allows for higher quality recommendations, but can also increase user comfort with sharing information generally. 

Increase Value

Designing emerging technologies in an inclusive way means working to understand the financial needs of customers living on low and moderate incomes and incorporating those needs into design choices to maximize value. 

Anticipate the Financial Needs of Users Living on LMI

Customers living on LMI represent a large and diverse group that cannot be reduced to a single shared set of needs or preferences. Delivering value through the design of tech products ultimately depends on understanding the alignment between technological capabilities and the needs of specific audiences. At the same time, there are common economic barriers and situations that should be kept in mind which may be relevant for the design of emerging technologies.

  • Customers living on LMI are more likely to have limited local branch access. Decreased access to physical banking branches in lower-income communities creates an opportunity for chatbots and other financial technologies to fulfill a broader range of banking needs.
  • Customers living on LMI have limited time. Users often feel that chatbots are a waste of time that lead them to dead ends or around in circles. Ensuring that chatbots can either provide action-oriented guidance quickly or recognize the need to transfer to a human agent is key to building stronger associations between chatbots and quick solutions.
Create Action-Oriented Chatbots

Although chatbots are perhaps most commonly thought of as sources of information that can provide guidance or recommendations, our research suggests that the ability to complete basic transactions through a chatbot interface has significant appeal.⁴ 

  • Support online banking services like sending payments through a chatbot. Digital payments are an essential financial service, particularly for households living on low to moderate incomes. More action-oriented chatbots that can facilitate easy payments provide an opportunity to design technology to match the needs of this group. 
  • Give users confirmations of completed actions through chatbots. Action-oriented chatbots that report back to users when a payment dispute has been resolved, a transfer has been made, or a credit card has been turned off or on can increase the usefulness of these tools for users.
Balance Automation and Control

Automated features can make it easier for customers to reach their financial goals with lower effort and time commitments. At the same time, users with lower incomes may have more fears about automating their financial lives when margins are small and errors can create significant setbacks. One way to increase engagement with automated financial tools is to give users control over how much automation is used, or to put guardrails in place that ensure users don’t overdraft any accounts.

  • Allow users to turn automated finance features on and off. Users may have volatile financial situations or anticipate future changes in their income and expenses. Allowing users to easily turn features on and off can help them engage with automated finance, while still knowing that they can update their preferences quickly should the need arise. Even if the technology involved is capable of adapting to these changes, allowing customers to turn it off will increase trust and demonstrate that they are still in control. 
  • Include “safety net” features that pause automated movement of money if an account balance falls below a certain threshold. When credit union users were surveyed about potential features in an automated finance app, 89% of respondents expressed interest in a feature that would pause savings deposits automatically based on account balances. Many financially vulnerable Americans have experienced getting charged an unexpected overdraft fee and are wary of any tool that may unintentionally lead to overdrafted accounts.
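The safety-net guardrail described above reduces to a simple balance check before each automated transfer. The sketch below is illustrative only; the threshold value and function names are assumptions, and a real product would let users configure the cutoff themselves.

```python
# Illustrative "safety net" sketch: skip an automated savings transfer
# whenever it would leave the account below a floor. The threshold is a
# hypothetical value, not from any real product.

LOW_BALANCE_THRESHOLD = 100.00  # minimum balance to preserve


def next_transfer_amount(balance: float, planned_transfer: float) -> float:
    """Pause the automated deposit if the balance is already below the
    threshold, or if the transfer would push it below."""
    if balance - planned_transfer < LOW_BALANCE_THRESHOLD:
        return 0.0
    return planned_transfer
```

Checking the post-transfer balance (rather than only the current balance) is the key design choice: it prevents the automation itself from causing the low-balance situation it is meant to guard against.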

 

This research and associated work was completed with the support of JPMorgan Chase & Co. The views and opinions expressed in this toolkit are those of the authors and do not necessarily reflect the views and opinions of these supporters or their affiliates.


¹ In a survey of households living on low incomes conducted by Harvard University’s Joint Center for Housing Studies on where households felt most comfortable cashing their checks, the most important element was that their information was kept confidential.

² In an FDIC survey, 59.4% of households without bank accounts reported that they do not trust banks, and cited feeling like banks were more focused on getting them to borrow money than on helping them save. In Commonwealth’s research, only 41% of respondents expected chatbots to provide correct answers, and only 28% believed a chatbot could give effective solutions to their financial issues. 

³ In the sphere of online shopping, 91% of respondents to a 2019 survey said that they read reviews before making a purchasing decision. 

⁴ While only 8% of respondents preferred a chatbot to a human for resolving a banking-related issue and 13% preferred a chatbot for receiving financial advice, 36% of respondents said they would rather talk to a chatbot than a human to send payments or complete banking transactions.
