Churn Sucks
Churn is notoriously the hardest part of running a successful SaaS business. Acquiring users is always doable with enough elbow grease put into organic channels or enough cash put into paid channels, but keeping users around – that's much more challenging.
If you're anything like me, you've been watching the crazy explosion of LLMs, led by ChatGPT, with some skepticism. It's clear that these tools do things that no software tools have been able to do before, but it's quite unclear whether those things are actually useful to users. Sure, it can generate text for them – draft a blog post, suggest improvements to their writing, tweak an email – but are those going to have any measurable impact on retention?
I'd argue that in most cases the answer is no. I'd be highly doubtful if someone told me that Notion's AI toolset increased retention by any meaningful margin, for example.
But what if we could use LLMs to give our users some new magic moments and keep them from churning?
The Point When Users Churn
When you get a new user, that user is there because they think that your software product can solve a problem for them, usually by saving them either time or money. If they churn, it's because your product failed, or succeeded but to a lesser degree than a competing tool.
Your solution can fail in a million ways, because every piece of software has limits. The most painful limits for users often aren't technical ones, but UI tradeoffs where the designers opted to exclude a potentially useful feature in favor of simplicity. Consider the following cases:
- In Google Drive, a user wants to reorganize all their documents to follow a new naming scheme or folder structure.
- In Notion, a user wants to rename a dozen documents to match a new pattern, for example "Part 1" to "Part 01" or "Friday 12/3 Memo" to "Dec 3 Memo".
- In a podcasting app, a user wants to export a report with only the episodes that have "Interview" in the title and the key stats from each.
In each of these examples, the software in question is theoretically capable of performing the operations that the user wants to execute. There's just no UI that streamlines the process because it's a fairly niche request. So the user has to go step by step, following a robotic process that takes forever. And in the face of that amount of work, the user is going to shop around for other tools that have the little time-saving UI element that they need. If they find it, they churn – and you've lost them for good.
As you can imagine, there's a long tail of scenarios like the above, and you can't possibly cover them all with UI, or your SaaS would look like a ridiculous mishmash of niche tools and UI elements. Luckily, it turns out that LLMs excel at covering these bases and keeping your users happy.
LLMs Succeed Where Your UI Fails
LLMs are good at processing text and yielding more text. Applying those skills to the problem set above is straightforward. We need to:
- Be able to provide the list of available operations to the LLM, in a concise, plaintext format (a sketch of one possible format follows this list);
- Be able to provide the context around the user's request, such as the currently viewed list of documents in Notion's case;
- Allow the user to type in plaintext what they want to accomplish;
- Process the LLM's output into instructions for your SaaS using the provided list of available operations.
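As a rough illustration, here's one way the first two pieces could look in TypeScript: a tiny action schema and a context shape, each serialized into plaintext for the prompt. The `renameDocument` action and the document-list context are hypothetical examples, not a prescription for any particular format.

```ts
// A sketch of action definitions and context, plus helpers that serialize
// both into plaintext for the prompt. All names here are hypothetical.

interface ActionParameter {
  name: string;
  type: "string" | "number" | "boolean";
  description: string;
}

interface ActionDefinition {
  name: string;
  description: string;
  parameters: ActionParameter[];
}

const actions: ActionDefinition[] = [
  {
    name: "renameDocument",
    description: "Rename a single document.",
    parameters: [
      { name: "documentId", type: "string", description: "ID of the document to rename" },
      { name: "newTitle", type: "string", description: "The new title" },
    ],
  },
];

// Context: whatever the user is currently looking at, e.g. a list of documents.
interface DocumentSummary {
  id: string;
  title: string;
}

// Produces lines like:
// "renameDocument(documentId: string, newTitle: string) - Rename a single document."
function serializeActions(defs: ActionDefinition[]): string {
  return defs
    .map((a) => {
      const params = a.parameters.map((p) => `${p.name}: ${p.type}`).join(", ");
      return `${a.name}(${params}) - ${a.description}`;
    })
    .join("\n");
}

// Produces lines like: doc_123: "Friday 12/3 Memo"
function serializeContext(docs: DocumentSummary[]): string {
  return docs.map((d) => `${d.id}: "${d.title}"`).join("\n");
}
```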
Putting this all together, you can give your user the power to type descriptions of their desired outcome in plain English, and convert that into a proposed list of actions to execute. If the user approves the list, you execute those instantly, and save the user hours of time — but most importantly, you keep them in your tool and give them no incentive to shop around.
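To make that flow concrete, here's a hedged sketch of the approve-then-execute step. It assumes the LLM has been instructed to reply with a JSON array of `{ action, params }` objects; the `askUserToApprove` and `executeAction` callbacks stand in for your own UI and application logic.

```ts
// A sketch of the confirmation flow: parse the LLM's reply into proposed
// actions, show them to the user, and only execute once approved.
// The response format and both callbacks are assumptions for illustration.

interface ProposedAction {
  action: string;
  params: Record<string, string | number | boolean>;
}

function parseProposedActions(llmResponse: string): ProposedAction[] {
  // Expecting a JSON array like:
  // [{ "action": "renameDocument", "params": { "documentId": "doc_123", "newTitle": "Dec 3 Memo" } }]
  const parsed = JSON.parse(llmResponse);
  if (!Array.isArray(parsed)) {
    throw new Error("LLM response was not a list of actions");
  }
  return parsed as ProposedAction[];
}

async function confirmAndExecute(
  proposed: ProposedAction[],
  askUserToApprove: (actions: ProposedAction[]) => Promise<boolean>,
  executeAction: (action: ProposedAction) => Promise<void>
): Promise<void> {
  const approved = await askUserToApprove(proposed);
  if (!approved) return; // The user can reject the whole plan.
  for (const action of proposed) {
    await executeAction(action); // Run sequentially so later steps see earlier results.
  }
}
```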
Implementation
Implementing the above is non-trivial, as you'll need to set up your integration with an LLM, figure out how to serialize and deserialize your application's context and operations into text, set up a chat-like UI for the user, and tie it all together. It's certainly doable, but...
The Fast Way
At Rehance we've made it easy for you to cover your bases and retain your users. Define the available actions and context in our UI, and add our drop-in JS script to your site to get the chat interface up and running. The UI widget's theme is customizable, and integrating the drop-in script should take a front-end engineer an hour or so, depending on the number of operations you're looking to support. Check it out at rehance.ai.
The Slow Way
Here's the roadmap:
- Define a set of actions. Each action needs to have a set of parameters. Only include the bare necessities, as too much complexity will confuse the LLM you use.
- Define the shape of the context that may be needed to perform the provided actions. The details here depend on your use case, so you'll need to figure out what shape works best for you. As with the actions, minimize the number of fields and keep the data types as simple as possible.
- Create a chat-like interface for your users. They should be able to enter text describing what they're looking to accomplish.
- Construct a prompt for an LLM, providing the context and actions, along with the user request. Process the response, and convert it into actual operations performed in your software application. Handle errors and edge cases appropriately (a sketch of this step follows the list).
- Run analytics on your users' prompts to figure out which actions they use the chat tool for most – these are likely the most frustrating parts of using your application.
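For the prompt-and-response step, a minimal sketch might look like the following. It reuses the `serializeActions`/`serializeContext` output and the `parseProposedActions` helper from the earlier sketches, and calls the OpenAI chat completions endpoint directly; the prompt wording and model name are placeholder choices, not recommendations.

```ts
// A sketch of prompt construction and response handling. It assumes the
// plaintext actions/context produced earlier and the parseProposedActions
// helper from the previous sketch; the prompt wording is a placeholder.

async function planActions(
  userRequest: string,
  actionsText: string,
  contextText: string,
  apiKey: string
): Promise<ProposedAction[]> {
  const prompt = [
    "You can only operate this application through the actions listed below.",
    "Available actions:\n" + actionsText,
    "Current context:\n" + contextText,
    "User request:\n" + userRequest,
    'Reply with ONLY a JSON array of objects shaped like {"action": string, "params": object}.',
  ].join("\n\n");

  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: `Bearer ${apiKey}` },
    body: JSON.stringify({
      model: "gpt-4o-mini", // any capable chat model works; this is just an example
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) {
    throw new Error(`LLM request failed with status ${res.status}`);
  }

  const data = await res.json();
  const content: string = data.choices[0].message.content;

  try {
    return parseProposedActions(content);
  } catch {
    // Don't execute anything on a malformed reply; ask the user to rephrase instead.
    throw new Error("Couldn't turn the model's reply into actions; please rephrase the request.");
  }
}
```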
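The analytics step can start out as simple as logging each request together with the actions it produced, then counting which actions show up most often. The `analytics.track` call below is a stand-in for whatever event pipeline you already have.

```ts
// A toy sketch of prompt analytics: record each request and the actions it
// produced so you can later rank the most common ones. `analytics.track`
// stands in for your existing event pipeline, and ProposedAction comes from
// the earlier sketch.

declare const analytics: {
  track: (event: string, props: Record<string, unknown>) => void;
};

function logChatUsage(userRequest: string, proposed: ProposedAction[]): void {
  analytics.track("llm_chat_request", {
    request: userRequest,
    actions: proposed.map((a) => a.action),
  });
}
```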
Building this type of interface into every piece of software is a no-brainer. Text inputs won't replace traditional UIs by any means, but they can certainly cover their blind spots.