Introduction to Flink Web UI

Updated on 2024-11-29 GMT+08:00

The Flink web UI provides a web-based visual development platform. You only need to write SQL statements to develop jobs, which greatly lowers the barrier to job development. Because the platform's capabilities are exposed directly, business personnel can write SQL statements themselves and respond quickly to new requirements, greatly reducing the Flink job development workload.
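
For example, a complete job on the platform can be expressed entirely in Flink SQL along the lines of the following sketch. The table names, topic names, and connector options are illustrative assumptions; the options actually available depend on the Flink version and connectors bundled with your cluster.

    -- Source table: order events read from a Kafka topic (names and options are examples)
    CREATE TABLE orders_src (
      order_id   STRING,
      amount     DOUBLE,
      order_time TIMESTAMP(3)
    ) WITH (
      'connector' = 'kafka',
      'topic' = 'orders',
      'properties.bootstrap.servers' = 'kafka-broker:9092',
      'properties.group.id' = 'flink-webui-demo',
      'format' = 'json',
      'scan.startup.mode' = 'latest-offset'
    );

    -- Sink table: cleaned events written to another Kafka topic
    CREATE TABLE orders_sink (
      order_id STRING,
      amount   DOUBLE
    ) WITH (
      'connector' = 'kafka',
      'topic' = 'orders_clean',
      'properties.bootstrap.servers' = 'kafka-broker:9092',
      'format' = 'json'
    );

    -- The job itself: drop invalid records and forward the rest
    INSERT INTO orders_sink
    SELECT order_id, amount
    FROM orders_src
    WHERE amount > 0;

On the console, each CREATE TABLE statement corresponds to a data table defined through data table management, and the INSERT INTO statement is what you submit as the job.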

Flink Web UI Features

The Flink web UI has the following features:

  • Enterprise-class visual O&M: GUI-based O&M management, job monitoring, and standardization of Flink SQL statements for job development.

  • Quick cluster connection: After configuring the client and user credential key file, you can quickly access a cluster using the cluster connection function.
  • Quick data connection: You can access a component by configuring a data connection. If Data Connection Type is set to HDFS, a cluster connection is required. For other data connection types, a cluster connection is required when Authentication Mode is set to KERBEROS, and not required when Authentication Mode is set to SIMPLE.
    NOTE:

    If Data Connection Type is set to Kafka, Authentication Type cannot be set to KERBEROS.

  • Visual development platform: The input/output mapping table can be customized to meet the requirements of different input sources and output destinations (a DDL sketch of such a table follows this list).
  • Easy-to-use, GUI-based job management.
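
As a sketch of the mapping-table item above: once an HDFS data connection (and its cluster connection) has been configured, a data table defined on the console corresponds roughly to DDL such as the following. The path, columns, and format are assumptions for illustration only.

    -- Mapping table for data stored in HDFS (path, columns, and format are illustrative)
    CREATE TABLE hdfs_orders (
      order_id   STRING,
      amount     DOUBLE,
      order_date STRING
    ) WITH (
      'connector' = 'filesystem',
      'path'      = 'hdfs:///user/flink/orders',
      'format'    = 'csv'
    );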

Key Web UI Capabilities

Table 1 lists the key capabilities provided by the Flink web UI.

Table 1 Key web UI capabilities

Batch-stream convergence

  • Batch jobs and stream jobs can be processed with a unified set of Flink SQL statements.

Flink SQL kernel capabilities

  • Flink SQL supports customized window sizes, stream computing within a 24-hour window, and batch processing beyond 24 hours (a windowed aggregation sketch follows this table).
  • Flink SQL supports reading data from Kafka and HDFS, writing data to Kafka, Redis, and HDFS, and joining Redis dimension tables.
  • A single job can define multiple Flink SQL statements, so that multiple metrics can be combined into one job for computation. If the statements share the same primary keys, inputs, and outputs, the job can compute multiple windows.
  • The AVG, SUM, COUNT, MAX, and MIN aggregate functions are supported.

Flink SQL functions on the console

  • Cluster connection management allows you to configure the clusters where services such as Kafka, Redis, and HDFS are deployed.
  • Data connection management allows you to configure access to services such as Kafka, Redis, and HDFS.
  • Data table management allows you to define the data tables accessed by SQL statements and to generate their DDL statements.
  • Flink SQL job definition allows you to verify, parse, and optimize the entered SQL statements, convert them into a Flink job, and submit the job for running.

Flink job visual management

  • Stream jobs and batch jobs can be defined in a visual manner.
  • Job resources, fault recovery policies, and checkpoint policies can be configured in a visual manner.
  • Status monitoring of stream and batch jobs is supported.
  • Flink job O&M is enhanced, including redirection to the native Flink monitoring page.

Performance and reliability

  • Stream processing supports 24-hour window aggregation with millisecond-level performance.
  • Batch processing supports 90-day window aggregation, which can be completed in minutes.
  • Invalid data can be filtered out in both stream processing and batch processing.
  • When HDFS data is read, it can be pre-filtered based on the calculation period.
  • Data in Flink jobs comes from Redis. If fault recovery policies have been configured for a job, the data is read from Redis again during recovery, so no data is lost when the job fails.
  • If the job definition platform is faulty or the service is degraded, jobs cannot be redefined, but the computation of existing jobs is not affected.
  • An automatic restart mechanism is provided for failed jobs, and the restart policy is configurable.
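
The following sketch illustrates the windowed aggregation capabilities listed above: a 24-hour tumbling window computed with COUNT and AVG. All names and connector options are illustrative, and the classic GROUP BY TUMBLE form is used here; newer Flink versions also offer windowing table-valued functions.

    -- Source with an event-time attribute and watermark (names and options are examples)
    CREATE TABLE orders_src (
      user_id    STRING,
      amount     DOUBLE,
      order_time TIMESTAMP(3),
      WATERMARK FOR order_time AS order_time - INTERVAL '5' SECOND
    ) WITH (
      'connector' = 'kafka',
      'topic' = 'orders',
      'properties.bootstrap.servers' = 'kafka-broker:9092',
      'format' = 'json',
      'scan.startup.mode' = 'earliest-offset'
    );

    -- Result table; the print connector is used only to keep the sketch self-contained
    CREATE TABLE daily_stats (
      user_id      STRING,
      window_start TIMESTAMP(3),
      order_cnt    BIGINT,
      avg_amount   DOUBLE
    ) WITH (
      'connector' = 'print'
    );

    -- 24-hour tumbling window aggregation using COUNT and AVG
    INSERT INTO daily_stats
    SELECT
      user_id,
      TUMBLE_START(order_time, INTERVAL '24' HOUR) AS window_start,
      COUNT(*)    AS order_cnt,
      AVG(amount) AS avg_amount
    FROM orders_src
    GROUP BY user_id, TUMBLE(order_time, INTERVAL '24' HOUR);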

Flink Web UI Application Process

The process of using the Flink web UI is as follows:

Figure 1 Flink web UI application process
Table 2 Description of the Flink web UI application process

  • Creating an application: Applications can be used to isolate different upper-layer services. For details, see Creating an Application.
  • Creating a cluster connection: Different clusters can be accessed by configuring cluster connections. For details, see Creating a Cluster Connection.
  • Creating a data connection: Through data connections, you can access different data services, including HDFS, Redis, and Kafka. For details, see Creating a Data Connection.
  • Creating a stream table: Data tables can be used to define the basic attributes and parameters of source tables, dimension tables, and output tables. For details, see Creating a Stream Table.
  • Creating a SQL/JAR job (stream/batch job): APIs can be used to define Flink jobs, including Flink SQL and Flink JAR jobs. For details, see Creating a Job. A sketch that combines a stream table, a dimension table, and a job SQL statement follows this table.
  • Managing jobs: A created job can be managed, including starting, developing, stopping, deleting, and editing the job. For details, see Creating a Job.
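
To show how the steps above fit together, the sketch below combines a Kafka stream table, a Redis dimension table, and a job SQL statement that performs a lookup join. The Kafka options follow the open-source connector; the Redis connector name and options are assumptions and must be replaced with those documented for the connector shipped with your cluster.

    -- Stream table: click events from Kafka, with a processing-time attribute for the lookup join
    CREATE TABLE clicks (
      user_id  STRING,
      url      STRING,
      proctime AS PROCTIME()
    ) WITH (
      'connector' = 'kafka',
      'topic' = 'clicks',
      'properties.bootstrap.servers' = 'kafka-broker:9092',
      'format' = 'json',
      'scan.startup.mode' = 'latest-offset'
    );

    -- Dimension table backed by Redis; the connector name and options are placeholders only,
    -- check the Redis connector documentation of your cluster version for the real keys
    CREATE TABLE user_dim (
      user_id   STRING,
      user_name STRING
    ) WITH (
      'connector' = 'redis',
      'host'      = 'redis-host',
      'port'      = '6379'
    );

    -- Output table; print keeps the sketch self-contained
    CREATE TABLE enriched_clicks (
      user_id   STRING,
      user_name STRING,
      url       STRING
    ) WITH (
      'connector' = 'print'
    );

    -- Job SQL: enrich the stream by looking up the dimension table
    INSERT INTO enriched_clicks
    SELECT c.user_id, d.user_name, c.url
    FROM clicks AS c
    JOIN user_dim FOR SYSTEM_TIME AS OF c.proctime AS d
      ON c.user_id = d.user_id;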
