The Zyeta backend is a scalable system that provides AI-driven knowledge base and conversation capabilities. Built with modern asynchronous Python, it is based on FastAPI and PostgreSQL with vector extensions, and integrates with leading LLM providers.

Core Features

  • Knowledge Base Management: Store, index, and retrieve documents with semantic search capabilities
  • Conversational AI: Interact with LLMs through persistent conversation contexts
  • Multi-Organization Support: Isolated environments with role-based access control
  • Custom Agents: Configurable AI agents with specialized capabilities
  • Tool Integration: Extend agent capabilities with custom tools
  • LLM Provider Flexibility: Connect to various LLM providers (OpenAI, Anthropic, etc.)

System Architecture

Zyeta follows a modular, service-oriented architecture built with FastAPI.

Key Components

API Layer

The API layer provides RESTful endpoints for all platform functionality:

  • FastAPI Framework: Modern, high-performance web framework
  • Automatic Documentation: OpenAPI/Swagger docs for all endpoints
  • Type Validation: Pydantic models for request/response validation
  • Authentication: JWT-based auth with role-based access control

Service Layer

Services encapsulate business logic for specific domains:

  • Authentication Service: User management, login, registration
  • Knowledge Base Service: Document storage, indexing, and retrieval
  • Organization Service: Multi-tenant organization management
  • Conversation Service: LLM interaction and message history
  • Agent Service: Configurable AI agent management
  • Tool Service: Custom tool creation for agents
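The pattern behind these services can be sketched as business logic talking to storage through an injected repository, which keeps each domain unit-testable without a database. The class and method names below are illustrative, not Zyeta's actual code:

```python
from dataclasses import dataclass, field


@dataclass
class InMemoryConversationRepo:
    # Stand-in for the real data layer; keyed by conversation id
    messages: dict[str, list[dict]] = field(default_factory=dict)

    def append(self, conversation_id: str, message: dict) -> None:
        self.messages.setdefault(conversation_id, []).append(message)

    def history(self, conversation_id: str) -> list[dict]:
        return list(self.messages.get(conversation_id, []))


class ConversationService:
    def __init__(self, repo: InMemoryConversationRepo):
        self.repo = repo

    def post_message(self, conversation_id: str, role: str, content: str) -> list[dict]:
        # Validate, persist, then return the updated history
        if role not in {"user", "assistant", "system"}:
            raise ValueError(f"unknown role: {role}")
        self.repo.append(conversation_id, {"role": role, "content": content})
        return self.repo.history(conversation_id)
```

Swapping the in-memory repository for a SQLAlchemy-backed one leaves the service's logic untouched, which is the point of the layering.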

Data Layer

The data layer handles persistent storage and retrieval:

  • PostgreSQL: Primary relational database
  • pgvector Extension: Vector storage for embeddings
  • SQLAlchemy ORM: Async database access with transaction support
  • Alembic Migrations: Database schema version control

Integration Layer

The backend connects to various external services:

  • LLM Providers: OpenAI, Anthropic, and others
  • File Storage: S3-compatible storage for documents
  • Email Services: User notifications and invitations
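The provider flexibility noted above usually comes from a small abstraction that each external service implements. A minimal sketch, with canned stub adapters in place of real OpenAI/Anthropic API calls:

```python
from typing import Protocol


class LLMProvider(Protocol):
    """What the rest of the backend expects from any LLM provider."""

    def complete(self, prompt: str) -> str: ...


class FakeOpenAI:
    # Stub adapter: a real one would call the OpenAI API here
    def complete(self, prompt: str) -> str:
        return f"[openai] {prompt}"


class FakeAnthropic:
    # Stub adapter: a real one would call the Anthropic API here
    def complete(self, prompt: str) -> str:
        return f"[anthropic] {prompt}"


PROVIDERS: dict[str, LLMProvider] = {
    "openai": FakeOpenAI(),
    "anthropic": FakeAnthropic(),
}


def complete(provider_name: str, prompt: str) -> str:
    # Provider selection is a registry lookup, so adding a provider
    # means adding an adapter, not branching logic
    return PROVIDERS[provider_name].complete(prompt)
```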

Key Features In Depth

Knowledge Base System

The knowledge base system allows storage and retrieval of documents with semantic search:

1. Document Upload: Users upload documents in various formats (PDF, DOCX, TXT)
2. Text Extraction: Text is extracted from documents using appropriate parsers
3. Chunking: Documents are split into manageable chunks for embedding
4. Embedding Generation: LLM-based embeddings are created for each chunk
5. Vector Storage: Embeddings are stored in pgvector for efficient similarity search
6. Semantic Search: User queries are converted to embeddings and matched against stored vectors
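The chunk-embed-search flow can be sketched end to end. A toy bag-of-words vector stands in for the real embedding model, and a linear scan stands in for pgvector's indexed similarity search, so the pipeline is runnable in isolation:

```python
import math


def chunk(text: str, size: int = 50, overlap: int = 10) -> list[str]:
    # Split into overlapping character windows (step = size - overlap)
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]


def embed(text: str, vocab: list[str]) -> list[float]:
    # Toy stand-in for an embedding model: word-count vector over a vocab
    words = text.lower().split()
    return [float(words.count(w)) for w in vocab]


def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def search(query: str, chunks: list[str], vocab: list[str]) -> str:
    # pgvector performs this ranking server-side with an index;
    # a linear scan is enough to show the idea
    q = embed(query, vocab)
    return max(chunks, key=lambda c: cosine(q, embed(c, vocab)))
```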

Conversation System

The conversation system enables persistent interactions with LLMs:

  • Streaming Responses: Real-time streaming of LLM responses
  • History Management: Persistent conversation history
  • Context Window Management: Smart handling of token limits
  • RAG Integration: Augment conversations with knowledge base results
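Context window management can be sketched as: keep the system prompt, then walk from the newest turn backwards, dropping whatever no longer fits the budget. Whitespace splitting approximates token counting here; the real system would use the provider's tokenizer:

```python
def approx_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer
    return len(text.split())


def fit_context(messages: list[dict], budget: int) -> list[dict]:
    system = [m for m in messages if m["role"] == "system"]
    turns = [m for m in messages if m["role"] != "system"]
    kept: list[dict] = []
    used = sum(approx_tokens(m["content"]) for m in system)
    # Walk newest-to-oldest so recent turns survive trimming
    for m in reversed(turns):
        cost = approx_tokens(m["content"])
        if used + cost > budget:
            break
        kept.append(m)
        used += cost
    return system + list(reversed(kept))
```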

Organization & RBAC

The multi-organization system with role-based access control:

  • Organization Isolation: Data separation between organizations
  • Role Hierarchy: Configurable role levels (Owner, Admin, Member, Guest)
  • Fine-grained Permissions: Resource:action permission model
  • Invitation Flow: Email-based user invitations
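The resource:action model with a role hierarchy can be sketched as set lookups plus a level comparison. The role names match the levels listed above; the permission sets themselves are illustrative, not Zyeta's actual defaults:

```python
ROLE_LEVELS = {"guest": 0, "member": 1, "admin": 2, "owner": 3}

ROLE_PERMISSIONS = {
    "guest": {"kb:read"},
    "member": {"kb:read", "kb:write", "conversation:create"},
    "admin": {"kb:read", "kb:write", "conversation:create", "member:invite"},
    "owner": {"*:*"},  # wildcard: every action on every resource
}


def has_permission(role: str, resource: str, action: str) -> bool:
    perms = ROLE_PERMISSIONS[role]
    return "*:*" in perms or f"{resource}:{action}" in perms


def outranks(role_a: str, role_b: str) -> bool:
    # e.g. only a strictly higher role may change a lower role's membership
    return ROLE_LEVELS[role_a] > ROLE_LEVELS[role_b]
```

Organization isolation sits above this layer: every query is also scoped by organization id, so permissions are only ever evaluated within a tenant.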

Technology Stack

  • Language: Python 3.10+
  • Web Framework: FastAPI
  • Database: PostgreSQL 15+ with pgvector
  • ORM: SQLAlchemy 2.0 (async)
  • Authentication: JWT tokens
  • Validation: Pydantic
  • Testing: Pytest
  • Linting: Ruff, mypy
  • Deployment: Docker + Kubernetes

Getting Started

To get started with the Zyeta backend:

API Reference

For API documentation, see the API Reference section.
