Build an AI application using the Vord workflow.

The workflow example below is designed to automate customer support. The system handles user queries efficiently by logically connecting nodes that classify, process, and respond to issues.

A workflow is a series of ordered tasks, activities, or steps designed to complete a specific business process. It describes the sequence of tasks, the conditions, the responsible parties, and other related information to ensure that work is carried out according to established processes and rules. Workflows often involve coordination and interaction between multiple participants and systems.
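To make this concrete, here is a minimal sketch in plain Python (not Vord's actual engine) of a workflow as connected nodes that pass shared state from step to step; the node names and the routing rule are invented purely for illustration:

```python
# A minimal sketch of a workflow: each node receives the shared state,
# does its work, and returns the id of the next node to run.

def start(state):
    state["query"] = state.get("query", "")
    return "classify"                      # next node id

def classify(state):
    # placeholder routing rule for illustration only
    state["class_name"] = "CLASS 3"
    return "answer"

def answer(state):
    state["reply"] = f"({state['class_name']}) You asked: {state['query']}"
    return None                            # end of the flow

NODES = {"start": start, "classify": classify, "answer": answer}

def run_workflow(query):
    state, node_id = {"query": query}, "start"
    while node_id is not None:
        node_id = NODES[node_id](state)
    return state["reply"]

print(run_workflow("How do I return a faulty item?"))
```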

Node Types Used

  • Start Node

  • Question Classifier Node

  • Knowledge Retrieval Node

  • LLM Node


Steps


Step 1: Create an application

Click the "Create Application" button on the homepage to create an application. Fill in the application name and select "Chatflow" as the application type.

Step 2: Build a Workflow

  1. Start Node

In a Chatflow, the Start node provides two built-in system variables: sys.query and sys.files.

sys.query is used for the user's input question in conversational applications.

sys.files is used for files uploaded during the conversation, such as an image, and needs to be used in conjunction with an image understanding model.
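As a rough illustration only (the class and field names below are assumptions for this sketch, not Vord's internal types), the Start node's system variables for one conversation turn can be pictured like this:

```python
# A sketch of the two system variables the Start node exposes in a Chatflow.
from dataclasses import dataclass, field
from typing import List

@dataclass
class StartNodeInput:
    sys_query: str                                       # the user's question for this turn
    sys_files: List[str] = field(default_factory=list)   # uploaded files, e.g. an image path

turn = StartNodeInput(sys_query="Why won't my order ship?", sys_files=["receipt.png"])
print(turn.sys_query, turn.sys_files)
```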

  2. Question Classifier Node

In this scenario, we set up three classification labels and their descriptions:

  • CLASS 1: Questions related to after-sales

  • CLASS 2: Questions about how to use products

  • CLASS 3: Other questions

Input Variables: This refers to the content that needs to be classified. Since this customer service application is a Chatflow, the input variable is sys.query, which represents the user's input within the conversational application.

Output Variable:

class_name

This is the class name output after classification. You can use the classification result variable in downstream nodes as needed.
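Conceptually, the classifier asks a model to pick one of the three labels and exposes the choice as class_name. The sketch below is a simplified stand-in for that behavior; call_llm is a placeholder, not a real Vord or model-provider API:

```python
# A hedged sketch of what the Question Classifier node does conceptually.
CLASSES = {
    "CLASS 1": "Questions related to after-sales",
    "CLASS 2": "Questions about how to use products",
    "CLASS 3": "Other questions",
}

def call_llm(prompt: str) -> str:
    # Stand-in for a real model call; always answers CLASS 3 here.
    return "CLASS 3"

def classify(sys_query: str) -> str:
    label_lines = "\n".join(f"{label}: {desc}" for label, desc in CLASSES.items())
    prompt = (
        "Classify the user question into exactly one label.\n"
        f"{label_lines}\n\n"
        f"Question: {sys_query}\nLabel:"
    )
    class_name = call_llm(prompt).strip()
    return class_name if class_name in CLASSES else "CLASS 3"

print(classify("My headphones arrived broken, can I get a refund?"))
```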

  3. Knowledge Retrieval Node

Before using this node, we need to register some knowledge. See the guide on how to create a knowledge base to learn more.

The query variable typically corresponds to the user's input question, which is essential for retrieving relevant information from the knowledge base that aligns with the user's query. In this context, the query variable is class_name, which is the output of the question classifier node.
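As a toy stand-in for what this node does, the sketch below searches a small in-memory knowledge base with the configured query and returns the matches as result; Vord's real node searches the knowledge base you registered, and the ranking shown here (simple keyword overlap) is an assumption for illustration:

```python
# A toy retrieval step standing in for the Knowledge Retrieval node.
KNOWLEDGE_BASE = [
    "Refunds are available within 30 days of purchase with a receipt.",
    "To pair the headphones, hold the power button for five seconds.",
    "Shipping normally takes three to five business days.",
]

def retrieve(query: str, top_k: int = 2):
    # Score each document by keyword overlap with the query, keep the best matches.
    terms = set(query.lower().split())
    scored = [(len(terms & set(doc.lower().split())), doc) for doc in KNOWLEDGE_BASE]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for score, doc in scored[:top_k] if score > 0]

result = retrieve("How long does shipping take?")
print(result)
```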

  4. LLM Node

This node uses a large language model to answer questions or process natural language.

Context variables are a special type of variable defined within the LLM node, used to insert externally retrieved text content into the prompt.

In this case, the downstream node of Knowledge Retrieval is the LLM node. The retrieval node's output variable result needs to be assigned to the context variable within the LLM node. Once associated, inserting the context variable at the appropriate position in the prompt incorporates the externally retrieved knowledge into the prompt.
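As a rough illustration of that association (the {{#context#}} placeholder and the call_llm function below are assumptions for this sketch, not guaranteed Vord syntax), the LLM node can be thought of as stitching the retrieval output into its prompt:

```python
# A sketch of how the LLM node inserts the retrieval output via a context variable.
PROMPT_TEMPLATE = (
    "You are a customer support agent. Answer using only the context below.\n\n"
    "Context:\n{{#context#}}\n\n"
    "Question: {query}\nAnswer:"
)

def call_llm(prompt: str) -> str:
    # Placeholder for the real model call.
    return "Shipping normally takes three to five business days."

def llm_node(query: str, result: list) -> str:
    context = "\n".join(result)   # the Knowledge Retrieval node's output variable
    prompt = PROMPT_TEMPLATE.replace("{{#context#}}", context).format(query=query)
    return call_llm(prompt)

print(llm_node("How long does shipping take?",
               ["Shipping normally takes three to five business days."]))
```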

Because this customer support application is a Chatflow, End nodes are not supported.

Step 3: Publish

Once the workflow runs as expected, publish the application so it can be used.
