Configuring DBT to Connect to DLI for Data Scheduling and Analysis

Updated on 2025-02-26 GMT+08:00

Data Build Tool (DBT) is an open-source data modeling and transformation tool that runs in Python environments. By connecting DBT to DLI, you can define and execute SQL transformations, supporting data lifecycle management from integration to analysis. This setup is suitable for large-scale data analysis projects and complex analysis scenarios.

This section describes how to configure DBT to connect to DLI.

Preparations

  • Environment requirements

    Make sure that your system environment meets the following requirements:

    • Operating system: Windows or Linux
    • Python is installed, as DBT is Python-based.

      Python version: Python 3.8 or later. Python 3.8 is recommended. (See the version check commands after this section.)

  • Obtaining the dli-dbt driver package

    Download the JDBC driver huaweicloud-dli-jdbc-xxx-dependencies.jar from the DLI management console.

  • Connection information:

    Table 1 Connection information

    | Item                      | Description                                                                                                  | How to Obtain          |
    |---------------------------|--------------------------------------------------------------------------------------------------------------|------------------------|
    | DLI's AK/SK               | AK/SK-based authentication refers to the use of an AK/SK pair to sign requests for identity authentication.  | Obtaining an AK/SK     |
    | DLI's endpoint address    | Endpoint of a cloud service in a region.                                                                     | Obtaining an Endpoint  |
    | DLI's project ID          | Project ID, which is used for resource isolation.                                                            | Obtaining a Project ID |
    | DLI's region information  | Region where your DLI resources are deployed.                                                                | Regions and Endpoints  |
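
Before installing anything, you can confirm that the Python environment meets the requirements above. A quick check using standard Python tooling:

    python --version
    pip --version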

Step 1: Create a DBT Environment

  1. Install dbt-core.

    Install the recommended version of dbt-core.

    pip install dbt-core==1.7.9
    NOTE:

    pip is a package management tool for Python that is typically installed alongside Python.

    If pip is not installed, install it using Python's built-in ensurepip module.

    python -m ensurepip

  2. Install dli-sdk-python.

    In the directory that contains the dli-sdk-python package's setup.py file, run the following installation command:

    python setup.py install
  3. Install dli-dbt.

    Download the dli-dbt driver from the DLI management console.

    In the directory where the driver package was extracted, run the following installation command:

    python setup.py install

    Run the following command to check whether dbt is successfully installed:

    dbt --version

Step 2: Connect DBT to DLI

Configure the profiles.yml file to store the information DBT needs to connect to DLI.

Locate the .dbt directory in the home directory of the server where DBT is installed, and create or edit the profiles.yml file there.

For example, in Windows, the path may be C:\Users\Username\.dbt\profiles.yml.

The file must contain the configuration of the connection between DBT and DLI. For example:

profiles:
  - name: dbt_dli
    target: dev
    outputs:
      dev:
        type: dli
        region: your-region-name
        project_id: your-project_id
        access_id: your-ak
        secret_key: your-sk
        queue: your-queue-name
        database: your-dli-database
        schema: your-dli-schema
Table 2 Parameters for connecting DBT to DLI

| Parameter                | Mandatory | Description                                                                                                | Example Value                    |
|--------------------------|-----------|------------------------------------------------------------------------------------------------------------|----------------------------------|
| type                     | Yes       | Data source type. Set it to dli in this example.                                                           | dli                              |
| region                   | Yes       | Region name.                                                                                               | ap-southeast-2                   |
| project_id               | Yes       | ID of the project where the DLI resources reside.                                                          | 0b33ea2a7e0010802fe4c009bb05076d |
| access_id and secret_key | Yes       | AK/SK pair used as the authentication key.                                                                 | -                                |
| queue                    | Yes       | DLI queue name.                                                                                            | dli_test                         |
| database                 | Yes       | Data directory name; the default is dli. If LakeFormation metadata is used, enter the data directory name. | dli                              |
| schema                   | Yes       | Name of the DLI database used to submit jobs.                                                              | tpch                             |
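
Combining the example values in Table 2, a completed profiles.yml might look like the following (the AK/SK values are placeholders for your own credentials):

profiles:
  - name: dbt_dli
    target: dev
    outputs:
      dev:
        type: dli
        region: ap-southeast-2
        project_id: 0b33ea2a7e0010802fe4c009bb05076d
        access_id: your-ak
        secret_key: your-sk
        queue: dli_test
        database: dli
        schema: tpch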

Step 3: Use DBT to Submit a Job to DLI

  1. Initialize a DBT project.

    Run the following command in an empty directory to initialize a DBT project:

    dbt init
  2. Configure the dbt_project.yml file.

    Create or edit the dbt_project.yml file in the root directory of the project.

    Ensure that the profile name referenced in dbt_project.yml matches the profile defined in profiles.yml in Step 2: Connect DBT to DLI (see the sketch below).
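    For reference, a minimal dbt_project.yml might look like the following sketch. The project name is illustrative; the profile value must match the profile name defined in profiles.yml:

    name: my_dli_project          # illustrative project name
    version: '1.0.0'
    profile: dbt_dli              # must match the profile name in profiles.yml
    model-paths: ["models"]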
  3. Verify the configuration.

    Run the following command to check whether the DBT configuration is correct:

    dbt debug
  4. Run the job.

    Once the check passes, run the following command to execute your data models:

    dbt run
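
    A dbt model is simply a SQL file in the project's models directory; dbt run compiles each model and executes the resulting SQL on DLI through the configured queue. A minimal, hypothetical model (the source table and columns are illustrative and must exist in the configured schema):

    -- models/orders_summary.sql (hypothetical model file)
    {{ config(materialized='table') }}

    select o_orderkey, o_totalprice
    from orders
    where o_totalprice > 1000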
