AI-powered search provider Sinequa announced today the availability of Sinequa Assistants, new enterprise-grade generative AI assistants that seamlessly integrate with all enterprise content and applications to augment and transform knowledge work.
Sinequa’s new generative AI Assistants empower employees to work more efficiently, effectively and creatively, allowing them to accelerate and improve decisions, devote more time to strategic initiatives and drive business success.
Generative AI (GenAI) is poised for rapid adoption. According to Gartner, “By 2027, GenAI will augment 30% of all knowledge workers’ tasks, from 0% in 2023.” (1) As organizations have grappled with how to reliably use generative AI in business, Sinequa’s Assistants demonstrate that combining search with GenAI in a technique called retrieval-augmented generation (RAG) enables the next generation of AI-augmented knowledge work. Sinequa’s unique Neural Search is the perfect complement to GenAI and provides the foundation for Sinequa’s Assistants. The Assistants’ capabilities go far beyond RAG’s conventional search-and-summarize paradigm to intelligently execute complex, multi-step activities, all grounded in facts to augment the way employees work. Assistants automate tedious tasks and streamline work processes with added insights for better results, improved quality, and a more satisfying work experience.
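For readers unfamiliar with the pattern, the sketch below shows the basic RAG loop in vendor-neutral Python: retrieve relevant passages first, then ground the generative model’s answer in them. The toy corpus, keyword scoring and prompt format are illustrative assumptions, not Sinequa’s implementation.

```python
# Minimal retrieval-augmented generation (RAG) sketch: retrieve supporting passages,
# then build a prompt that grounds the generative model in those passages.
# The corpus, scoring and prompt below are illustrative stand-ins only.

from dataclasses import dataclass

@dataclass
class Passage:
    doc_id: str
    text: str

CORPUS = [
    Passage("policy-42", "Travel expenses must be filed within 30 days."),
    Passage("policy-17", "Remote work requires manager approval."),
]

def retrieve(query: str, k: int = 2) -> list[Passage]:
    """Toy keyword-overlap ranking; a real system would use hybrid lexical + vector search."""
    terms = set(query.lower().split())
    scored = sorted(CORPUS, key=lambda p: -len(terms & set(p.text.lower().split())))
    return scored[:k]

def build_prompt(query: str, passages: list[Passage]) -> str:
    """Ground the answer in retrieved passages and ask the model to cite doc ids."""
    context = "\n".join(f"[{p.doc_id}] {p.text}" for p in passages)
    return (
        "Answer using only the sources below and cite their ids in brackets.\n"
        f"Sources:\n{context}\n\nQuestion: {query}\nAnswer:"
    )

if __name__ == "__main__":
    question = "When do travel expenses have to be filed?"
    prompt = build_prompt(question, retrieve(question))
    print(prompt)  # this grounded prompt can then be sent to any LLM
```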
Unlike other AI assistants or copilots that lack secure access to the full scope of all enterprise content, Sinequa’s Assistants leverage any and all company content and knowledge to generate contextually relevant insights and recommendations, while ensuring privacy and data governance. Optimized for scale with three custom-trained small language models (SLMs) for the best relevance at low cost, Sinequa Assistants ensure accurate conversational responses on any internal topic, complete with citations and full traceability to the original source.
Sinequa Assistants work with any public or private generative LLM, including Cohere, OpenAI, Google Gemini, Microsoft Azure OpenAI, Mistral and others, allowing companies to choose which LLMs best meet their needs while controlling costs. The Sinequa Assistant framework powers a range of ready-to-go Assistants along with tools to define custom Assistant workflows so that customers can use an Assistant out of the box, or tailor and manage multiple Assistants from a single platform. These Assistants can be tailored to fit the needs of specific business scenarios and deployed and updated quickly without code or additional infrastructure. Some of the domain-specific Assistants available include:
• The Augmented Scientist empowers research teams to converse with scientific content from an ever-increasing number of data sources to speed up clinical trials and drug development and to streamline Research & Development (R&D) processes.
• The Augmented Engineer empowers design teams with a unified view of projects, products, and parts and the ability to construct and search across a digital thread.
• The Augmented Lawyer gives lawyers and paralegals powerful self-service research capabilities across all case files and information through a time-saving AI search.
• The Augmented Asset Manager empowers financial asset managers and advisors to leverage valuable insight from contracts, portfolio history and documents.
All of the Assistants are powered by Sinequa’s search platform, combining the most accurate and performant hybrid search technology and three custom SLMs. A flexible Assistant framework makes it easy to define custom workflows informed by search and combine that with any generative AI model, whether commercial, private, or open source.
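As background on hybrid search in general, one common way to blend a lexical (keyword) ranking with a semantic (vector) ranking is reciprocal rank fusion. The snippet below is a generic illustration of that idea under assumed inputs, not a description of Sinequa’s internals.

```python
# Generic hybrid-search illustration: reciprocal rank fusion (RRF) merges two
# ranked lists of document ids into one. Not Sinequa-specific.

def rrf_merge(lexical_ranking: list[str], vector_ranking: list[str], k: int = 60) -> list[str]:
    """Fuse two ranked lists; the constant k dampens the weight of top ranks."""
    scores: dict[str, float] = {}
    for ranking in (lexical_ranking, vector_ranking):
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# A document ranked well by both signals rises to the top.
print(rrf_merge(["d1", "d2", "d3"], ["d2", "d3", "d1"]))  # ['d2', 'd1', 'd3']
```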
“To better capitalize on the feedback following production incidents in our refineries, we implemented JAFAR (Jenerative AI for Availability REX), a new search app designed to streamline information retrieval in TotalEnergies’ knowledge databases. Powered by Sinequa’s search engine/RAG combined with generative AI, JAFAR enhances decision-making by analyzing documents and providing recommendations,” said Aude Giraudel, Head of Smart Search Engines, TotalEnergies. (2)
While most companies are still experimenting with basic search-and-summarize RAG, Sinequa has gone much further with Assistants that take RAG to the next level. Sinequa’s Assistants execute multi-step workflows to accomplish complex tasks with reasoning, incorporating RAG as needed to fully leverage corporate knowledge. This ensures that the Assistant responds accurately, transparently, and securely, with the most up-to-date information, including inline citations to original sources and immediate traceability and access to those sources.
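The sketch below suggests what such a multi-step, retrieval-grounded workflow with citation tracking can look like in principle; the step functions and state shape are hypothetical illustrations, not the Sinequa Assistant framework’s actual API.

```python
# Hedged sketch of a multi-step assistant workflow: each step reads and enriches a
# shared state, retrieval steps attach sources, and the final answer keeps citation ids.
# All step names and data here are hypothetical illustrations.

from typing import Callable

Step = Callable[[dict], dict]

def retrieve_step(state: dict) -> dict:
    """Fetch supporting passages for the current question (stubbed here)."""
    state["sources"] = [("doc-7", "Q3 downtime was caused by a sensor fault.")]
    return state

def draft_step(state: dict) -> dict:
    """Draft an answer grounded in the retrieved sources, preserving citation ids."""
    cited = ", ".join(doc_id for doc_id, _ in state["sources"])
    state["answer"] = f"Likely root cause: sensor fault [{cited}]."
    return state

def run_workflow(question: str, steps: list[Step]) -> dict:
    """Run the steps in order over a shared state dictionary."""
    state = {"question": question}
    for step in steps:
        state = step(state)
    return state

result = run_workflow("What caused the Q3 downtime?", [retrieve_step, draft_step])
print(result["answer"])  # the answer carries inline citations traceable to its sources
```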
“Sinequa has been investing in Artificial Intelligence and leading Enterprise Search for the last 20 years thanks to our unmatched connectivity and scalability on enterprise content. Having pioneered the use of LLMs in search with our custom trained SLMs, Sinequa is perfectly positioned to take search and RAG to the next level with the release of Sinequa’s Generative AI Assistants. This is an exciting time as our Assistants augment the way we work, empowering knowledge workers to deliver higher quality work in less time, and with greater confidence and lower mental stress. At Sinequa we are proud to support our customers in their GenAI transformation,” said Jean Ferré, CEO and Co-Founder of Sinequa.
As part of this paradigm shift, and alongside the unveiling of its Assistants, Sinequa has also launched a new brand identity and company website to reflect this new era of work augmented by search-informed Assistants, as well as Sinequa’s role in advancing AI in the enterprise. AI is fundamentally transforming how work gets done and represents a new world of opportunity for augmenting employees and businesses.
Published Tue, 04 Jun 2024 09:50:00 GMT