AI Services module

Contextual chatbot, AI-powered insights, mentor matching, and notification digests — backed by your choice of LLM provider.

Optional module. Enable in Settings → Modules.

AI Services adds a contextual chat assistant that can answer questions about your org's data, plus AI-powered analytics for org admins.

What this module enables

| Surface | What it does |
| --- | --- |
| Floating chatbot widget | Bottom-right of every page; ask questions in natural language |
| AI insights | Org admins get periodic summaries (e.g., "5 chapters at risk of falling behind on dues") |
| Document Q&A | Ask questions of your Documents library |
| AI mentorship pairing | (Optional) AI suggests mentor-mentee matches based on profiles |

How the chatbot works

  1. The user asks a question.
  2. GreekManage embeds the question and matches it against your org's data (members, documents, events, etc.).
  3. Relevant context plus the question is sent to a language model (Anthropic Claude, OpenAI, or Google Gemini — configurable).
  4. The model returns an answer, citing sources.
  5. The response streams back to the user.

This is retrieval-augmented generation (RAG). The model never sees data outside what's retrieved for the specific question.
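The retrieval step above can be sketched in a few lines. This is a minimal illustration, not GreekManage's implementation: the `embed` function here is a toy bag-of-words stand-in for a real embedding model, and the corpus, field names, and prompt format are all made up.

```python
import math

def embed(text):
    """Toy embedding: bag-of-words term counts (a stand-in for a real
    embedding model; illustrative only)."""
    vec = {}
    for word in text.lower().split():
        vec[word] = vec.get(word, 0) + 1
    return vec

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(count * b.get(word, 0) for word, count in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question, corpus, k=2):
    """Rank the org's chunks by similarity to the question; keep the top k."""
    q = embed(question)
    ranked = sorted(corpus, key=lambda doc: cosine(q, embed(doc["text"])),
                    reverse=True)
    return ranked[:k]

def build_prompt(question, context_docs):
    """Assemble the prompt: retrieved context plus the user's question.
    Only this retrieved context reaches the model."""
    context = "\n".join(f"[{d['source']}] {d['text']}" for d in context_docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

corpus = [
    {"source": "events", "text": "chapter meeting starts Thursday 7pm"},
    {"source": "invoices", "text": "spring dues balance is 150 dollars"},
]
question = "when does the chapter meeting start"
prompt = build_prompt(question, retrieve(question, corpus, k=1))
```

Because the prompt is built only from the retrieved chunks, the model's view of your data is limited to what matched the question, which is the property the RAG claim above is describing.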

Privacy boundaries

  • Per-user scoping. A member can't get answers about another chapter's roster — the retrieval respects their access.
  • No cross-tenant access. Org A's chatbot never sees Org B's data.
  • Logging. Conversations are logged for quality improvement; org admins can disable this per org.
  • PII handling. The model receives only what's needed for the answer; sensitive fields (SSN, payment cards) are never sent.
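The scoping and PII rules above amount to two filters applied before anything reaches the model. The sketch below is a hypothetical illustration of that idea; the field names (`org_id`, `chapter_ids`, `ssn`) are assumptions, not GreekManage's actual schema.

```python
# Fields that must never reach the model (illustrative names).
SENSITIVE_FIELDS = {"ssn", "payment_card"}

def scope_filter(chunks, user):
    """Keep only chunks the requesting user may read: same org
    (no cross-tenant access) and a chapter within their scope."""
    return [
        c for c in chunks
        if c["org_id"] == user["org_id"] and c["chapter_id"] in user["chapter_ids"]
    ]

def redact(record):
    """Drop sensitive fields before a record is added to the prompt."""
    return {k: v for k, v in record.items() if k not in SENSITIVE_FIELDS}

user = {"org_id": "org-a", "chapter_ids": {"alpha"}}
chunks = [
    {"org_id": "org-a", "chapter_id": "alpha", "text": "alpha roster"},
    {"org_id": "org-a", "chapter_id": "beta", "text": "beta roster"},
    {"org_id": "org-b", "chapter_id": "alpha", "text": "other org"},
]
visible = scope_filter(chunks, user)  # only the alpha-chapter chunk survives
```

Filtering at retrieval time (rather than asking the model to withhold data) is what makes the guarantee enforceable: the model cannot leak a chunk it never received.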

Provider configuration

Org admins (or platform admins, depending on your setup) configure:

  • Provider: Anthropic Claude, OpenAI, Google Gemini
  • Model: provider-specific (e.g., Claude Sonnet, GPT-4, Gemini Pro)
  • Knowledge scope: which data sources the chatbot can retrieve from (members, events, documents, invoices, etc.)
  • Logging: on or off
  • BYOK (bring your own key): paste your provider API key and pay the provider directly, or use the platform-managed key
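Taken together, the settings above might look like the following. This is a hypothetical shape with illustrative field names, not GreekManage's actual configuration schema; it just shows how the BYOK choice falls back to the platform-managed key.

```python
# Providers named in the docs above (identifiers here are assumptions).
ALLOWED_PROVIDERS = {"anthropic", "openai", "google"}

def validate_ai_config(cfg):
    """Minimal sanity checks on an org's AI settings."""
    assert cfg["provider"] in ALLOWED_PROVIDERS, "unknown provider"
    assert cfg["knowledge_scope"], "chatbot needs at least one data source"
    # BYOK key present -> the org pays the provider directly;
    # otherwise fall back to the platform-managed key.
    cfg["key_source"] = "byok" if cfg.get("byok_api_key") else "platform"
    return cfg

cfg = validate_ai_config({
    "provider": "anthropic",
    "model": "claude-sonnet",
    "knowledge_scope": ["members", "events", "documents"],
    "logging": False,
    "byok_api_key": None,
})
```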

Cost model

If you're using a platform-managed key, AI usage is billed via your subscription tier with a monthly query cap. If you use BYOK, you pay the provider directly.
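The cap logic described above reduces to one branch. A minimal sketch, assuming a simple per-month usage counter; the function name and cap handling are illustrative, not the platform's actual billing code.

```python
def query_allowed(monthly_usage, tier_cap, key_source):
    """BYOK orgs bypass the platform cap (they pay the provider directly);
    platform-managed keys are limited by the subscription tier's monthly cap."""
    if key_source == "byok":
        return True
    return monthly_usage < tier_cap
```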

Track usage in Org → Settings → AI → Usage.

Walkthroughs

| For | Page |
| --- | --- |
| Members using the chatbot | Using the AI chatbot |

Use cases

  • "When does the next chapter meeting start?" — pulls from events
  • "How much do I owe?" — pulls from invoices
  • "What's the dress code for formal?" — pulls from event description and documents
  • "Where can I find the bylaws?" — links to the document
  • "Summarize last week's Engage discussions" — summarizes for org admins

Where AI can fail

  • Hallucination — the model may confidently state something incorrect. Always verify financial or legal answers.
  • Stale data — embeddings are refreshed on a schedule; very recent changes may not be retrievable for a few minutes.
  • Privacy mistakes — if your data has been entered into the wrong access scope, the AI may surface it. Audit your access scopes.

Dependencies

  • All other modules the AI is allowed to read from
  • A configured provider key (BYOK or platform-managed)
  • Storage — for embeddings (the platform manages this)

Pricing

AI Services is typically a paid add-on. Pricing depends on provider, model, and query volume. Confirm with your platform admin.

Related modules

  • Documents — the chatbot indexes documents for retrieval
  • Operations — the chatbot can answer dues, election, and compliance questions
  • Community — the chatbot can summarize forum threads