
From Search Overload to Instant Answers: How askHE Transforms Product Documentation Access 

As health plans become increasingly reliant on complex software platforms to manage operations, timely access to accurate product information is essential. Yet traditional documentation search tools often fall short. Users face dozens of potentially relevant topics, forcing them to click through multiple pages and piece together answers manually—a time-consuming process that slows down critical operational workflows.

At HealthEdge®, we recognized this growing challenge and responded with askHE: an AI-powered documentation assistant that transforms how users interact with HealthRules® Payer product knowledge.

The Documentation Discovery Problem

Standard search functionality across HealthRules Payer help center modules works well for simple lookups, but breaks down when answers span multiple documentation sources. A claims processor investigating records and rule definitions might retrieve 15-20 relevant topics through fuzzy matching, then spend valuable minutes reading each one to find the specific information needed.

This scattered approach to information retrieval created operational friction. Staff needed a way to ask natural language questions and receive synthesized answers that pull context from across the entire documentation landscape, not just a list of potentially relevant pages.
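The limitation is easy to illustrate with a toy scoring function: keyword search ranks by literal word overlap, so a user who phrases a question naturally, rather than in the documentation's own jargon, gets many weak matches instead of one strong one. The example below is a minimal sketch of that failure mode (the topic title and scores are illustrative, not drawn from HealthRules Payer documentation):

```python
def keyword_score(query: str, doc: str) -> float:
    """Score by literal word overlap: the fraction of query words found in the doc."""
    q_words = set(query.lower().split())
    d_words = set(doc.lower().split())
    return len(q_words & d_words) / len(q_words)

topic = "benefit rule definitions for claim adjudication"

# A user who already knows the exact jargon scores a perfect match...
print(keyword_score("claim adjudication rule definitions", topic))  # 1.0

# ...but a natural-language paraphrase of the same need barely overlaps,
# which is why keyword search returns long lists of weak candidates.
print(keyword_score("how is a claim priced", topic))                # 0.2
```

Semantic retrieval sidesteps this by comparing embeddings of the question and the documentation, so paraphrases land near the right topics even with no shared vocabulary.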

Building the AI-Powered Answer Engine

Figure 1: Architecture: RAG System with Azure AI Search and OpenAI Integration 

The askHE implementation leverages a modern Retrieval-Augmented Generation (RAG) architecture built on Microsoft Azure infrastructure:

  • Vector-Powered Document Retrieval: Azure AI Search stores document indexes and vector embeddings, connecting to Azure Blob Storage for efficient document retrieval. When users submit questions, the RAG system identifies relevant documentation chunks through semantic similarity matching rather than simple keyword searches.
  • Intelligent Response Synthesis: Retrieved documents feed into OpenAI’s GPT-3.5 model, which synthesizes information across multiple sources into coherent, conversational responses. Unlike traditional search that returns document lists, askHE processes content in the background and delivers ready-to-use answers.
  • Microservices Architecture: Azure Functions running Python 3 provide the application backbone. The microservice ingests requests through an API Gateway, transforms user queries to call the indexer for matching documents, and orchestrates OpenAI API calls to generate synthesized responses—all with minimal latency.
  • Seamless User Experience: Integration with MadCap Flare, the external documentation hosting platform, embeds askHE directly within the help center application. Users access the chatbot where they already work, eliminating context switching between search tools.
  • Citation-Driven Transparency: Every askHE response includes citation links to source topics, allowing users to verify information or explore concepts in greater depth. Users can click through to specific topic pages or navigate directly to relevant sections of product PDF guides.
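The retrieve-then-synthesize flow described above can be sketched in a few lines. This is a toy illustration only: an in-memory index with hand-picked vectors stands in for Azure AI Search, and the prompt is assembled but not sent to OpenAI; all chunk ids, texts, and embedding values are hypothetical, not HealthEdge's actual implementation.

```python
import numpy as np

# Toy documentation chunks with made-up embeddings. In production these vectors
# would come from an embedding model and be stored in Azure AI Search.
DOC_CHUNKS = [
    {"id": "claims-overview",  "text": "Claims records capture adjudication results.", "vec": np.array([0.9, 0.1, 0.0])},
    {"id": "rule-definitions", "text": "Benefit rules define how claims are priced.",  "vec": np.array([0.7, 0.6, 0.1])},
    {"id": "member-portal",    "text": "Members view coverage in the portal.",         "vec": np.array([0.0, 0.2, 0.9])},
]

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve(query_vec: np.ndarray, k: int = 2) -> list[dict]:
    """Rank chunks by semantic similarity to the query embedding, keep top k."""
    ranked = sorted(DOC_CHUNKS, key=lambda c: cosine(query_vec, c["vec"]), reverse=True)
    return ranked[:k]

def build_prompt(question: str, chunks: list[dict]) -> str:
    """Assemble a grounded prompt; chunk ids become the citations in the answer."""
    context = "\n".join(f"[{c['id']}] {c['text']}" for c in chunks)
    return (
        "Answer using only the context below. Cite sources by id.\n\n"
        f"{context}\n\nQuestion: {question}"
    )

# A question about claim pricing should retrieve the two claims-related chunks.
query_vec = np.array([0.8, 0.5, 0.0])  # stand-in for the embedded user question
top = retrieve(query_vec)
prompt = build_prompt("How are claims priced?", top)
```

In the real system, `prompt` would be sent to the GPT model via the OpenAI API, and the `[id]` citations in the response would be rendered as links back to the source topics.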

Rapid Adoption Validates the Approach


Figure 2: askHE Query Volume and User Engagement Trends 

The numbers tell a compelling story: 23,000 queries from 890 unique users in October alone demonstrate that askHE has become the go-to resource for anyone accessing HealthRules Payer help documentation, from claims processors and customer service representatives to providers and members seeking product information.

This adoption occurred organically, driven by the fundamental value proposition: instant, synthesized answers instead of manual documentation hunting. While formal time savings haven’t been quantified yet, user behavior speaks clearly—staff consistently choose askHE over traditional search when they need answers quickly.

The early metrics validate the strategic decision to invest in conversational AI for documentation access, positioning HealthEdge to scale this capability as user needs continue to evolve.

The Strategic Path Forward: Scaling RAG Beyond Documentation

The askHE success establishes a proven foundation for HealthEdge’s AI Platform, enabling rapid development of AI agents that leverage the RAG architecture. The same approach is being explored for care plan summaries that synthesize member health information, and for claims adjudication decision support that gives processors real-time guidance by retrieving relevant policy documentation, medical necessity criteria, and historical precedents.

The askHE foundation creates opportunities for continuous improvement and measurement. Future iterations will focus on quantifying time savings per query, measuring user satisfaction through embedded feedback mechanisms, and tracking response accuracy to optimize the RAG pipeline.

RAG architecture’s flexibility allows for expansion—adding new documentation sources, fine-tuning retrieval algorithms, and potentially upgrading to more powerful language models as business needs and investment justification evolve. The vector database infrastructure scales efficiently, supporting growing documentation volumes without degrading performance.

By removing barriers to information access, HealthEdge is empowering users across the enterprise to make faster, more confident decisions. As we continue to expand AI capabilities across our product suite, we remain focused on practical solutions that reduce friction and improve user experience. To explore more about how HealthEdge is leveraging AI across the health plan lifecycle, visit our AI Resource Center or contact us for a personalized demo.

About the Author

Marcus Barlett is a machine learning engineer who thrives at the intersection of data, automation, and a good challenge. With a knack for turning messy datasets into sleek, intelligent tools, he’s helped streamline everything from solar performance models to healthcare contract extraction. Outside of work, he’s always exploring new ways to blend creativity with technology.