Configure the conversational bot settings

You can configure multiple conversational search bots in your environment. Once configured, you can edit or delete your search bots from the Chatbot Configuration page.

To configure your conversational bot, do the following:

  1. In the SmartHub administration portal, under Conversational Search, click Chatbot Configuration.

  2. On the Chatbot Configuration page, click Add to create a new chatbot or edit an existing chatbot.

  3. On the Conversational Bot Settings page, complete the following fields:

    | Setting | Description | Default value |
    | --- | --- | --- |
    | Conversational Bot Name | Enter a name for your conversational bot. | Conversational Bot |
    | Conversational Bot Description | Enter a description for your conversational bot. You can include details such as the intended usage of the particular search bot or other configuration specifics that you have specified. | - |
    | Search Engines | Select the search engines that you want to retrieve data from. | No search engines are selected by default. |
    | Metadata Properties | Enter a comma-separated list of chunk properties from which the AI generates a response. | ESC_ChunkBody,clickUri,title |
    | Maximum number of chunks to use | Specify the maximum number of text fragments used to generate the answer. | 10 |
    | Maximum number of proof documents to return | Specify the maximum number of proof documents on which the generated response is based. | 5 |
    | Original Document Properties | Enter a comma-separated list of original document properties returned for the proof documents. If you are using Smart Preview settings, you must ensure that the following properties are requested from each document (their exact values may vary depending on your source system): FileType, Size, Chunk_Body, ClickUri, Title | clickUri,title,Rank |
    | Additional Query template constraints | Defines the structured format SmartHub uses to translate conversational input into a backend search request. It can incorporate parameters produced by the query processor, such as rewritten keywords, entities, or filters, to apply intent-aware logic, enforce business rules, and improve retrieval precision. Denote these parameters by enclosing them in curly "{}" brackets. For example: '{cleanSearchQuery} FederatorBackends:"{sources}" AND ({computedFilter})' | - |
    | Maximum Output Tokens | Controls the maximum number of tokens the LLM is allowed to generate in its response for a single conversational search interaction. If the model reaches this limit, it stops generating additional text, even if more content could be produced. | 8000 |
    | Select an LLM Configuration | Select a configured LLM provider from the drop-down list. | - |
    | Select an Agent | Select a configured agent group from the drop-down list. | Agent Group Default |
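    The query template substitution described for Additional Query template constraints can be sketched as follows. This is an illustrative example only, not SmartHub's implementation; the parameter names ({cleanSearchQuery}, {sources}, {computedFilter}) come from the example above, and the sample values are hypothetical.

    ```python
    # Sketch: expanding a query template whose curly-brace parameters are
    # filled in with values produced by the query processor.
    template = '{cleanSearchQuery} FederatorBackends:"{sources}" AND ({computedFilter})'

    # Hypothetical query-processor output for a single user question.
    params = {
        "cleanSearchQuery": "vacation policy",
        "sources": "SharePoint",
        "computedFilter": "FileType:pdf",
    }

    query = template.format(**params)
    print(query)
    # vacation policy FederatorBackends:"SharePoint" AND (FileType:pdf)
    ```

    Because the template is expanded before the search request is issued, constraints such as the FederatorBackends clause are enforced on every query regardless of how the user phrases the question.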
  4. After you select an agent, the Smart Preview Settings section appears. In this section, you can specify settings for the document previews opened from your conversational search chats. These previews automatically open the document page that is most relevant to the user's question, based on AI-powered content analysis. SmartHub also uses AI to extract the most important entities from the user's question and the AI's response, and highlights those entities in the preview window. You can provide values for the following Smart Preview settings fields:

    Note: To use document previews, you must be running SmartHub 7.1 and Smart Previews 4.0.x or later. Additionally, you must add the necessary fields under Original Document Properties.
    | Setting | Description | Default value |
    | --- | --- | --- |
    | Phrase match padding length | Defines how much text, preceding or succeeding the key phrase, is included when extracting a chunk. This ensures more meaningful context when navigating to the relevant section during in-document search actions. | 500 |
    | Smart Previews Prompt Template | Specifies the prompt template that is sent to the LLM. This template instructs the model on how to identify and extract the most relevant phrase from the provided content. | See the default template below. |
    | Smart Previews Input Format | Specifies the JSON structure used to organize the context before it is sent to the LLM. You must not modify the {ContextPlaceholder} and {ContentPlaceholder} tags, as they are essential for dynamic content insertion. | See the default structure below. |
    | Smart Previews Key Path | Specifies the full path to the key in the JSON response from the LLM. This path is used to accurately extract the relevant information. | choices[0].message.tool_calls[0].function.arguments |

    The default Smart Previews Prompt Template is:

    "Given a Context, a Question, and an Answer generated by an LLM, do the following actions: Step 1: Extract the single most relevant substring from the Context (not the Answer) used to generate the Answer. Preserve all characters exactly, and encode line breaks as escaped \n characters (do not use actual line breaks). Step 2: Extract exactly ten unique, verbatim, three-word phrases from the Context. Step 3: From the given question and its answer, extract the top 5 most meaningful and specific terms that best represent their core purpose. Avoid vague or overly broad terms unless they are part of a named concept or hold technical significance. Order by their importance!"

    The default Smart Previews Input Format is:

    {
        "max_completion_tokens": 8000,
        "messages": [
            {
                "content": "{ContextPlaceholder}",
                "role": "user"
            }
        ],
        "model": "gpt-4.1",
        "parallel_tool_calls": false,
        "temperature": 0,
        "tool_choice": "required",
        "tools": [
            {
                "function": {
                    "description": "{ContentPlaceholder}",
                    "name": "Orchestrator-Extract-Most-Relevant-Information",
                    "parameters": {
                        "properties": {
                            "most_important_entities": {
                                "description": "Top 5 most important entities extracted from the question and from the answer, in order of their importance.",
                                "items": {
                                    "type": "string"
                                },
                                "type": "array"
                            },
                            "three_word_extractions": {
                                "description": "Three-word phrases from the context",
                                "items": {
                                    "type": "string"
                                },
                                "type": "array"
                            },
                            "top_relevant_phrase": {
                                "description": "The most relevant phrase from the context for the question and answer",
                                "type": "string"
                            }
                        },
                        "required": [
                            "top_relevant_phrase",
                            "three_word_extractions",
                            "most_important_entities"
                        ],
                        "type": "object"
                    },
                    "strict": false
                },
                "type": "function"
            }
        ]
    }
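    To illustrate how a key path such as choices[0].message.tool_calls[0].function.arguments is resolved against the LLM response, here is a minimal sketch. This is an assumption-laden illustration, not SmartHub's implementation, and the response object is a hypothetical example shaped like a tool-call reply.

    ```python
    import json
    import re

    def get_by_path(obj, path):
        # Split "a[0].b" into alternating dictionary keys and list indexes,
        # then walk the structure token by token.
        for token in re.findall(r"[^.\[\]]+", path):
            obj = obj[int(token)] if token.isdigit() else obj[token]
        return obj

    # Hypothetical LLM response following the tool-call shape of the
    # default Smart Previews Input Format.
    response = {
        "choices": [{
            "message": {
                "tool_calls": [{
                    "function": {
                        "arguments": json.dumps(
                            {"top_relevant_phrase": "maximum number of tokens"}
                        )
                    }
                }]
            }
        }]
    }

    arguments = get_by_path(
        response, "choices[0].message.tool_calls[0].function.arguments"
    )
    print(json.loads(arguments)["top_relevant_phrase"])
    # maximum number of tokens
    ```

    The value found at the key path is itself a JSON string containing the extracted phrase, three-word extractions, and entities, which is why it is parsed a second time in this sketch.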
  5. Click Save.

  6. To delete a search bot configuration, click the delete icon.