Updated on 2026-01-04 GMT+08:00

Overview

As generative AI and intelligent analytics technologies see wide adoption, DWS continues to evolve its AI capabilities. Starting from version 9.1.1.200, DWS supports AI features such as vector computing, in-database inference, and the MCP Server. DWS AI integrates data analytics, semantic search, and intelligent prediction in one platform, covering the entire workflow from data mining and parsing to intelligent inference and search. It helps enterprises maximize data value and make better real-time decisions at lower cost, driving business growth and meeting diversified requirements in AI scenarios.

  • Vector computing

    DWS integrates the pgvector extension to provide storage and retrieval of high-dimensional vectors. Vector similarity search (cosine distance, L2 distance, and inner product) runs directly in the database, and mainstream index structures such as IVFFlat and HNSW greatly improve query performance over large vector sets. Combined with the in-database inference capability of DWS, this enables AI applications such as semantic search, recommendation, and retrieval-augmented generation (RAG) inside the database.
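    A minimal sketch of this workflow, using the upstream pgvector syntax (the table name, vector dimension, and data below are illustrative only; `<=>` is pgvector's cosine-distance operator):

    ```sql
    -- Illustrative only: table name, dimension, and rows are made up;
    -- the syntax follows the upstream pgvector extension.
    CREATE EXTENSION IF NOT EXISTS vector;

    CREATE TABLE docs (
        id        bigserial PRIMARY KEY,
        content   text,
        embedding vector(3)   -- real embeddings are typically hundreds of dimensions
    );

    INSERT INTO docs (content, embedding) VALUES
        ('hello', '[1,0,0]'),
        ('world', '[0,1,0]');

    -- HNSW index on cosine distance to speed up large-scale similarity search
    CREATE INDEX ON docs USING hnsw (embedding vector_cosine_ops);

    -- Top-5 nearest neighbors of a query vector by cosine distance
    SELECT id, content
    FROM docs
    ORDER BY embedding <=> '[1,0.1,0]'
    LIMIT 5;
    ```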

  • In-database inference

    DWS integrates the pgai extension, so you can call large language models (LLMs) and embedding models directly in the database without depending on external AI platforms. This simplifies the RAG application workflow, improves response speed and database security, and provides a more efficient and flexible data analysis experience.
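    As a sketch of what an in-database embedding call might look like, assuming a pgai-style SQL function (the function name and model identifier below are placeholders, not the confirmed DWS pgai API; consult the DWS pgai reference for the actual signatures):

    ```sql
    -- Hypothetical sketch: ai.embed() and the model name are placeholders
    -- for the embedding function exposed by the pgai extension.
    CREATE EXTENSION IF NOT EXISTS pgai;

    -- Generate an embedding for each row without leaving the database
    SELECT id,
           ai.embed('some-embedding-model', content) AS embedding
    FROM docs;
    ```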

  • MCP Server

    The DWS MCP Server is implemented based on the Model Context Protocol (MCP). It enables AI models and agents to connect seamlessly to DWS clusters and provides functions such as statement execution and metadata query. Combined with large models, users can operate the database directly in natural language: the system converts natural-language requests into SQL statements, executes them, and returns the query results for flexible, intelligent data processing.
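    For example, a metadata query that an agent might issue through the MCP Server's statement-execution function can be ordinary SQL against the information schema (the MCP tool interface itself is defined by the protocol and not shown here):

    ```sql
    -- Standard information_schema query; an MCP-connected agent could run
    -- this to discover the tables and columns available in a cluster.
    SELECT table_name, column_name, data_type
    FROM information_schema.columns
    WHERE table_schema = 'public'
    ORDER BY table_name, ordinal_position;
    ```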

  • Feature operators of large models

    Large-model feature operators are packaged as an extension file stored in the DWS system. The extension is created with the CREATE EXTENSION command, after which large models invoke the encapsulated Python UDFs to process and analyze data intelligently.
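    A usage sketch with hypothetical names (the actual extension and UDF names ship with the DWS extension file):

    ```sql
    -- Hypothetical names: llm_ops and llm_classify() are placeholders
    -- for the packaged large-model feature operators.
    CREATE EXTENSION llm_ops;

    -- Call an encapsulated Python UDF to classify text in place
    SELECT comment_text,
           llm_classify(comment_text) AS sentiment
    FROM customer_reviews;
    ```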