Applying LLMs to Increase the Efficiency of SaaS Applications

Explores the potential application of LLMs to increase the speed at which users can accomplish tasks in SaaS applications.

All Software Consists of Operations

In any software application, users are looking to accomplish some set of tasks. Many of these tasks are special cases of CRUD operations (create, read, update, delete). Everything from CRMs to project management tools to accounting software to social media platforms qualifies as such. What differentiates these applications is the type and shape of the data being operated on, the permissions on that data (who can read, write, update, and delete it), and the operations that are permitted.

Users cannot be permitted to perform arbitrary operations on the data, even when they have permission to access it, as exposing every possibility would lead to a bloated, confusing, and ultimately unusable UI. Instead, the permitted operations are carefully selected to be the most common and important ones. In a project management tool, for example, users can create and manage issues at a high level. But it is doubtful that such a tool would let them perform highly specific operations like "change the assignee of all issues with the label 'bug' that were created in the last 24 hours." The number of users who would need such an operation is negligible, and the complexity of the UI required to support it would be high. Thus, even though the operation would be extremely valuable to the users who need it, it is not supported.

The UI Complexity vs. Value Tradeoff is Dead

The tradeoff described above is dead — or at least it soon will be, killed by ChatGPT and other large language models. LLMs are uniquely capable of taking plain-English input, parsing it, and transforming it into whatever output format is needed.

For example, given a list of the operations a piece of software is capable of performing, an LLM is quite good at translating a user's plain-English request (to reuse the earlier example, "change the assignee of all issues with the label 'bug' that were created in the last 24 hours") into a JSON-formatted list of operations like this one:

[
  {
    "operation": "Fetch Issues",
    "parameters": {
      "created": ">2024-01-31",
      "label": "bug"
    }
  },
  {
    "operation": "Assign Issues",
    "parameters": {
      "issues": "operation-1-output",
      "assignee": "John Doe"
    }
  }
]

It would then be trivial for the software in question, in this case a project management application, to parse and execute those operations on behalf of the user.
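
As a rough illustration, here's a minimal sketch of that execution step. The operation names come from the example above, but the Issue shape and the fetchIssues/assignIssues helpers are hypothetical stand-ins for whatever data layer the application already has:

// A minimal sketch of executing a parsed operation plan. The Issue shape and
// the fetchIssues/assignIssues helpers are hypothetical placeholders.

type Operation = { operation: string; parameters: Record<string, unknown> };
type Issue = { id: string; label: string; created: string; assignee?: string };

function fetchIssues(params: Record<string, unknown>): Issue[] {
  // Would query the application's database; stubbed here.
  console.log("querying issues with", params);
  return [];
}

function assignIssues(issues: Issue[], assignee: string): Issue[] {
  return issues.map((issue) => ({ ...issue, assignee }));
}

// Execute steps in order; later steps can reference earlier outputs
// via tokens like "operation-1-output".
function executePlan(plan: Operation[]): unknown[] {
  const results: unknown[] = [];
  for (const step of plan) {
    if (step.operation === "Fetch Issues") {
      results.push(fetchIssues(step.parameters));
    } else if (step.operation === "Assign Issues") {
      const ref = String(step.parameters.issues); // e.g. "operation-1-output"
      const index = Number(ref.match(/operation-(\d+)-output/)?.[1] ?? "1") - 1;
      const issues = (results[index] as Issue[]) ?? [];
      results.push(assignIssues(issues, String(step.parameters.assignee)));
    } else {
      throw new Error(`Unsupported operation: ${step.operation}`);
    }
  }
  return results;
}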

And all that's needed to give users this flexibility is a chat-like interface that can be called up at any time: one UI component that covers the highly specific but high-value operations users occasionally need to perform.

Other Benefits

But wait, there's more!

Data Generation

While translating user requests into operations that a SaaS application knows how to execute is helpful in that it gives users a faster way to accomplish tasks, the same mechanism can also power generative features. To continue with the project management example, the same chat-like interface could be used to generate a set of new issues ("Create a set of issues that serve the overarching goal of increasing the number of users who sign up for our product"). The LLM can generate the requested issues from scratch and add them natively to the application.
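
A rough sketch of that generative flow might look like the following, where callLLM and createIssue are hypothetical placeholders for your model provider and your application's existing creation path:

// Sketch of a generative flow: the user's goal goes into a prompt asking for
// issues in a fixed JSON shape, and the parsed result is inserted through the
// application's normal creation path. callLLM and createIssue are placeholders.

type DraftIssue = { title: string; description: string; label: string };

async function callLLM(prompt: string): Promise<string> {
  // Placeholder: call whichever model provider you use and return its text output.
  console.log("prompting model:", prompt);
  return `[{"title": "Add Google sign-in", "description": "...", "label": "growth"}]`;
}

async function createIssue(draft: DraftIssue): Promise<void> {
  // Placeholder: the application's existing issue-creation logic.
  console.log("creating issue:", draft.title);
}

async function generateIssues(goal: string): Promise<void> {
  const prompt =
    `You are working inside a project management tool.\n` +
    `Generate a set of issues that serve this goal: "${goal}".\n` +
    `Respond with only a JSON array of objects with the keys ` +
    `"title", "description", and "label".`;

  const raw = await callLLM(prompt);
  const drafts: DraftIssue[] = JSON.parse(raw); // validate before trusting in production

  for (const draft of drafts) {
    await createIssue(draft); // issues appear natively in the app
  }
}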

Analytics

An interface like the one described in this article would most often be used to speed up sets of tasks that would previously have involved a lot of manual point-and-click work, or that wouldn't have been possible at all. As such, it would be an invaluable source of analytics for product teams. Just looking at the most common prompts gives a good idea of which pain points users are hitting and which features they're looking for that the application lacks.
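
Even a naive aggregation goes a long way here. As a minimal sketch, assuming you log each prompt as plain text (in practice you'd likely cluster semantically similar requests, for example with embeddings):

// Count the most common prompts after basic normalization.
function topPrompts(prompts: string[], limit = 10): Array<[string, number]> {
  const counts = new Map<string, number>();
  for (const prompt of prompts) {
    const normalized = prompt.trim().toLowerCase();
    counts.set(normalized, (counts.get(normalized) ?? 0) + 1);
  }
  return Array.from(counts.entries())
    .sort((a, b) => b[1] - a[1])
    .slice(0, limit);
}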

After all, a user is most likely to churn when they run into a task that they need to accomplish but can't, or that is so tedious that they're open to exploring other options that could help them do it faster. It's a no-brainer to both identify when this is happening and provide a quick solution in the form of an LLM-powered chat interface.

Implementation

Implementing the above is non-trivial: you'll need to set up an integration with an LLM, figure out how to serialize your application's operations and other relevant data into text and deserialize the model's responses, build a chat-like UI for the user, and tie it all together. It's certainly doable, but...

The Fast Way

At Rehance, we've made it easy for you to cover your bases and retain your users. Define the available actions and context in our UI, then add our drop-in JS script to your site to get the chat interface up and running. The theme of the UI widget is customizable, and integrating the drop-in script should take a front-end engineer an hour or so, depending on the number of operations you're looking to support. Check it out at rehance.ai.

The Slow Way

Here's the roadmap:

  1. Define a set of actions. Each action needs a name and a set of parameters. Only include the bare necessities, as too much complexity will confuse the LLM you use (a sketch covering steps 1, 2, and 4 follows this list).
  2. Define the shape of the context that may be needed to perform the provided actions. The details depend on your use case, so you'll need to figure out what shape works best for you. As with the actions, minimize the number of fields and keep the data types as simple as possible.
  3. Create a chat-like UI for your users. They should be able to enter text describing what they're looking to accomplish.
  4. Construct a prompt for the LLM, providing the context and actions along with the user's request. Parse the response and convert it into actual operations performed in your application, handling errors and edge cases appropriately.
  5. Run analytics on your users' prompts to figure out which actions they use the chat tool for most; these are likely the most frustrating parts of using your application.
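
To make the roadmap concrete, here's a minimal sketch of steps 1, 2, and 4. The action names, context fields, and the callLLM helper are made up for illustration; treat it as a starting point under those assumptions, not a complete implementation:

// Step 1: a small, explicit catalog of actions and their parameters.
const actions = [
  {
    name: "Fetch Issues",
    parameters: { label: "string (optional)", created: "ISO date filter (optional)" },
  },
  {
    name: "Assign Issues",
    parameters: { issues: "reference to a previous step's output", assignee: "string" },
  },
];

// Step 2: only the context the model actually needs, kept small and flat.
type AppContext = {
  currentUser: string;
  teamMembers: string[];
  labels: string[];
};

type PlannedOperation = { operation: string; parameters: Record<string, unknown> };

async function callLLM(prompt: string): Promise<string> {
  // Placeholder: call whichever model provider you use and return its text output.
  console.log("prompting model:", prompt);
  return "[]";
}

// Step 4: build the prompt, then parse and validate the response before executing it.
async function planOperations(request: string, context: AppContext): Promise<PlannedOperation[]> {
  const prompt =
    `You translate user requests into operations.\n` +
    `Available actions:\n${JSON.stringify(actions, null, 2)}\n` +
    `Context:\n${JSON.stringify(context, null, 2)}\n` +
    `User request: "${request}"\n` +
    `Respond with only a JSON array of {"operation", "parameters"} objects.`;

  const raw = await callLLM(prompt);

  let plan: PlannedOperation[];
  try {
    plan = JSON.parse(raw);
  } catch {
    throw new Error("Model did not return valid JSON");
  }

  // Reject anything that isn't in the action catalog before executing it.
  const allowed = new Set(actions.map((a) => a.name));
  for (const step of plan) {
    if (!allowed.has(step.operation)) {
      throw new Error(`Unsupported operation: ${step.operation}`);
    }
  }
  return plan;
}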

Building this type of interface into every piece of software is a no-brainer. Text inputs won't replace traditional UIs by any means, but they can certainly cover their blind spots.