Generative AI has become a driving force in the business world, and organizations are eagerly trying to leverage its benefits while mitigating the associated risks. To address this, IBM Watson Assistant has introduced Conversational Search, powered by the Granite large language model and Watson Discovery. This beta release enables AI Assistants to deliver faster and more accurate answers by scaling conversational interactions grounded in business content.
Paving the way: Large language models
Much of the focus in generative AI has been on Large Language Models (LLMs), which are revolutionizing how people discover and interact with knowledge. Traditionally, enterprises relied on keyword-based search engines to support customers and employees. The introduction of chatbots, however, created the need to answer questions that follow no predefined path and have no predefined answer. IBM Watson Assistant has been successfully meeting this need for several years. With LLMs and generative AI, IBM is taking this capability even further.
Introducing Conversational Search for Watson Assistant
IBM is pleased to announce the beta release of Conversational Search for Watson Assistant. This feature combines the power of the Granite LLM and Watson Discovery to provide comprehensive, contextually grounded answers, enabling outcome-oriented interactions. Conversational Search integrates seamlessly with the augmented conversation builder, allowing customers and employees to automate both answers and actions.
Users of the Plus or Enterprise plans of Watson Assistant can now request early access to Conversational Search.
How does Conversational Search work behind the scenes?
When a user asks a question, Watson Assistant determines the best way to help – whether to trigger a prebuilt conversation, employ conversational search, or escalate to a human agent. The new transformer model in Watson Assistant achieves higher accuracy with less training data. Conversational Search relies on two key steps: retrieval and generation. Retrieval finds the most relevant information using search capabilities from Watson Discovery, while generation uses that information to produce a conversational response. By leveraging the Retrieval Augmented Generation framework, Watson Assistant minimizes the need to retrain the LLM.
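The retrieval-and-generation flow above can be sketched in a few lines. This is a minimal, self-contained illustration of the Retrieval Augmented Generation pattern, not Watson code: the keyword-overlap retriever stands in for Watson Discovery, and the `generate` function stands in for the Granite LLM.

```python
# Minimal sketch of Retrieval Augmented Generation (RAG).
# retrieve() stands in for Watson Discovery; generate() stands in for the LLM.

def retrieve(query, documents, top_k=2):
    """Rank documents by naive keyword overlap with the query."""
    query_terms = set(query.lower().split())
    scored = [(len(query_terms & set(doc.lower().split())), doc)
              for doc in documents]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for score, doc in scored[:top_k] if score > 0]

def generate(query, passages):
    """Stand-in for the LLM: fold retrieved passages into a grounded answer."""
    if not passages:
        return "I could not find an answer in the knowledge base."
    return "Based on our documents: " + " ".join(passages)

docs = [
    "Credit card applications require proof of income.",
    "Savings accounts earn interest monthly.",
]
question = "How do I apply for a credit card?"
answer = generate(question, retrieve(question, docs))
```

Because the LLM only rephrases what retrieval supplies, the knowledge base can be updated without retraining the model – the property the RAG framework provides.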
Conversational AI that drives open innovation
IBM Watson Assistant Conversational Search is a flexible platform that delivers accurate answers across various channels and touchpoints. IBM offers deployment options on IBM Cloud as well as a self-managed Cloud Pak for Data deployment option for semantic search with Watson Discovery. Semantic search will be available as a configurable option for Conversational Search in the future. Additionally, organizations can bring their proprietary data to customize Watson LLM models through watsonx.ai, or leverage third-party models for conversational search and other use cases.
Conversational Search in action
Let’s look at a real-life scenario: a bank customer using Watson Assistant with Conversational Search to apply for a credit card. The assistant can seamlessly extract information from the user’s messages to gather the necessary details, call the appropriate backend service, and return the offer details to the user. If no suitable prebuilt conversation exists, Conversational Search looks through the bank’s knowledge documents and answers the user’s question. If a special topic is recognized, the assistant escalates to a human agent.
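The routing described in this scenario can be sketched as a simple decision function. This is a hedged illustration only: the phrase and topic matching here is keyword-based, whereas Watson Assistant uses its trained transformer model, and the action names are hypothetical.

```python
# Sketch of the routing logic described above: prebuilt actions first,
# escalation topics next, conversational search as the fallback.
# Keyword matching is a stand-in for Watson Assistant's transformer model;
# action names are hypothetical.

PREBUILT_ACTIONS = {"apply for a credit card": "start_credit_card_application"}
ESCALATION_TOPICS = {"fraud", "complaint"}

def route(user_message):
    text = user_message.lower()
    for phrase, action in PREBUILT_ACTIONS.items():
        if phrase in text:
            return ("prebuilt_action", action)
    if any(topic in text for topic in ESCALATION_TOPICS):
        return ("human_agent", None)
    return ("conversational_search", None)
```

For example, "I want to apply for a credit card" triggers the prebuilt action, "I need to report fraud" escalates to an agent, and anything else falls through to conversational search over the knowledge documents.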
FAQ
What is Conversational Search?
Conversational Search is a feature in IBM Watson Assistant that utilizes the Granite large language model and Watson Discovery to provide fast and accurate answers to user queries in a conversational manner.
How does Conversational Search work?
Conversational Search involves two steps: retrieval and generation. Retrieval finds relevant information using search capabilities from Watson Discovery, while generation uses the Granite LLM to produce a conversational response from that information. The aim is to deliver rich, contextually grounded answers.
Can Conversational Search be customized?
Yes, organizations can bring their proprietary data to customize Watson LLM models using watsonx.ai. Additionally, third-party models from the Hugging Face community can be leveraged for conversational search and other use cases.
How can I access Conversational Search?
Users of the Plus or Enterprise plans of Watson Assistant can request early access to Conversational Search. They can contact their IBM Representative to gain exclusive access or schedule a demo with an expert.
Will semantic search be available for Conversational Search?
Yes, semantic search will be available as a configurable option for Conversational Search in the future, allowing enterprises to run and deploy the feature according to their specific needs and preferences.
Is Watson Assistant committed to responsible AI usage?
IBM understands the importance of using AI responsibly. Watson Assistant allows organizations to enable Conversational Search selectively based on recognized topics and offers trigger words to automatically escalate to a human agent when necessary to ensure responsible and effective usage.