
Collecting Incremental Log Data of Driving Behavior

Updated on 2025-02-12 GMT+08:00

Scenario Overview

Data Ingestion Service (DIS) collects incremental log data of driving behavior, uploads it to Object Storage Service (OBS), and analyzes the uploaded log data through Data Lake Insight (DLI). The analysis results reflect driving behaviors and can be used by vehicle enterprises to provide value-added services such as driving habit optimization.

Figure 1 Service process diagram

The procedure is as follows:

  1. Creating an OBS Bucket
  2. Creating a DIS Stream
  3. Creating a Dump Task
  4. Obtaining Authentication Information
  5. Installing an Agent
  6. Preparing a Data Sample
  7. Configuring a DIS Agent
  8. Starting the DIS Agent
  9. Viewing Uploaded Files on OBS
  10. Creating a Database
  11. Creating an OBS Table
  12. Querying a Data Sample
  13. Querying Results

Creating an OBS Bucket

Create an OBS bucket for storing the data dumped by DIS. For details, see Creating a Bucket.

Creating a DIS Stream

Create a stream. For details, see Creating a DIS Stream.

Creating a Dump Task

  1. Log in to the DIS console.
  2. In the navigation tree, choose Stream Management.
  3. Click the stream created in Creating a DIS Stream. On the displayed page, click the Dump Management tab.
  4. Click Create Dump Task. On the Create Dump Task page, configure dump parameters.

    NOTE:
    • A maximum of five dump tasks can be created for each stream.
    • A dump task cannot be added to a stream whose Source Data Type is FILE.

  5. Click Create Now.

    Table 1 Dump task parameters

    • Dump Destination: Destination to which data is dumped. OBS: after streaming data is stored to DIS, it is periodically imported to OBS; real-time file data stored to DIS is imported to OBS immediately. Value: OBS.
    • Task Name: Name of the dump task. The name must be unique within a stream, be 1 to 64 characters long, and contain only letters, digits, hyphens (-), and underscores (_).
    • Dump File Format: text, csv, parquet, or carbon. Set this parameter as required.
    • Dump Bucket: Name of the OBS bucket used to store data from the DIS stream, that is, the bucket created in Creating an OBS Bucket.
    • File Directory: Directory created in OBS to store files from the DIS stream. Directory levels are separated by slashes (/), and the value cannot start with a slash. The directory name is 0 to 50 characters long and is left blank by default.
    • Time Directory Format: Time-based directory layout created under the dump file directory in the OBS bucket. For example, if the time directory is accurate to the day, the save path is "bucket name/dump file directory/year/month/day". Possible values: N/A (no time directory), yyyy, yyyy/MM, yyyy/MM/dd, yyyy/MM/dd/HH, and yyyy/MM/dd/HH/mm. You can only select, not enter, a value in this field.
    • Record Delimiter: Delimiter used to separate dump records. Possible values: comma (,), semicolon (;), vertical bar (|), newline (\n), and NULL. You can only select, not enter, a value in this field.
    • Offset: Latest: maximum offset, meaning the latest data is extracted. Earliest: minimum offset, meaning the earliest data is extracted. Value: Latest.
    • Dump Interval (s): User-defined interval at which data is imported from the DIS stream into OBS. If no data is pushed to the stream during an interval, no dump file package is generated for it. Value range: 30 to 900 seconds. Default: 300.

Obtaining Authentication Information

  • Obtaining an Access Key ID/Secret Access Key (AK/SK)
    To obtain an access key, perform the following steps:
    1. Log in to the management console, move the cursor to the username in the upper right corner, and select My Credentials from the drop-down list.
    2. On the My Credentials page, choose Access Keys, and click Create Access Key. See Figure 2.
      Figure 2 Clicking Create Access Key
    3. Click OK and save the access key file as prompted. The access key file will be saved to your browser's configured download location. Open the credentials.csv file to view Access Key Id and Secret Access Key.
      NOTE:
      • Each user can have a maximum of two access keys.
      • To keep the access key secure, the key file is downloaded automatically only when it is generated for the first time; it cannot be obtained from the management console later. Store it safely.
  • Obtaining a project ID and account ID
    A project is a group of tenant resources. The account ID identifies the current account, and the IAM user ID identifies the current user. You can view the project IDs, account IDs, and user IDs for different regions on the corresponding pages.
    1. Register with and log in to the management console.
    2. Hover the cursor on the username in the upper right corner and select My Credentials from the drop-down list.
    3. On the API Credentials page, obtain the account name, account ID, IAM username, and IAM user ID, and obtain the project and its ID from the project list.
  • Obtaining the endpoint

    An endpoint is the request address for calling an API. Endpoints vary depending on services and regions. You can obtain the endpoints of the service from Regions and Endpoints.

Installing an Agent

  1. Obtain the Agent installation package at https://dis-publish.obs-website.cn-north-1.myhuaweicloud.com/.
  2. Decompress the dis-agent-X.X.X.zip package to the current folder.

Preparing a Data Sample

  1. Obtain the sample data package.
  2. Decompress the package to the current folder.

Configuring a DIS Agent

  1. Use the file manager to access the conf directory of the DIS agent program, for example, C:\dis-agent-X.X.X\conf.
  2. Open the agent.yml file using an editor and modify parameter values in the file based on the site requirements.

    NOTE:
    • Each configuration item and its value must be separated by a colon (:) followed by a space.
    • The agent.yml file uses Unix (LF) line endings. You are advised to edit it with an editor that preserves this format, such as Sublime Text.
    Table 2 Parameters in the agent.yml file

    • region (mandatory): Region where DIS is deployed. For details about how to obtain it, see Obtaining Authentication Information. Default: none.
    • ak (mandatory): User's access key (AK). See Obtaining Authentication Information. Default: none.
    • sk (mandatory): User's secret access key (SK). See Obtaining Authentication Information. Default: none.
    • projectId (mandatory): Project ID of your region. See Obtaining Authentication Information. Default: none.
    • endpoint (mandatory): DIS gateway address, in the format https://DIS endpoint. See Obtaining Authentication Information. Default: none.
    • body.serialize.type (optional): Format of the uploaded DIS data packet (non-raw data format). json: the packet is encapsulated as JSON. protobuf: the packet is encapsulated in binary format, which reduces its size by about one third; recommended when a massive amount of data is generated. Default: json.
    • body.compress.enabled (optional): Whether to enable data compression. Default: false.
    • body.compress.type (optional): Compression format used when compression is enabled. lz4: a compression algorithm with a fast compression speed and high compression efficiency. zstd: a newer lossless algorithm with a fast compression speed and a high compression ratio. Default: lz4.
    • PROXY_HOST (optional): Proxy IP address. Mandatory when requests are sent through a proxy server. Default: none.
    • PROXY_PORT (optional): Proxy port. Default: 80.
    • PROXY_PROTOCOL (optional): Proxy protocol. HTTP and HTTPS are supported. Default: http.
    • PROXY_USERNAME (optional): Proxy username. Default: none.
    • PROXY_PASSWORD (optional): Proxy password. Default: none.

    flows

    The flows section describes the files to be uploaded to DIS. The following upload mode is supported:

    DISStream: the DIS Agent monitors text files continuously, collects incremental data in real time, parses it by delimiter, and uploads it to DIS streams (source data types: BLOB, JSON, and CSV). Table 3 describes the configuration parameters. The agent.yml file provides example parameter settings; a minimal sketch is also given after the table.

    Table 3 DISStream configuration parameters

    • DISStream (mandatory): Name of the DIS stream. File content matching filePattern is parsed by delimiter and uploaded to this stream. Default: none.
    • filePattern (mandatory): File monitoring path. Only one directory can be monitored per flow, and directories are not monitored recursively; to monitor multiple directories, configure multiple streams under flows. File names can be matched with an asterisk (*). For example, /tmp/*.log matches all files in /tmp whose names end with .log, and /tmp/access-*.log matches those starting with access- and ending with .log. In Windows, an example path is D:\logs\*.log. Default: none.
    • directoryRecursionEnabled (optional): Whether to search subdirectories. false: do not search subdirectories recursively; match only files in the root directory. true: search all subdirectories recursively. For example, if filePattern is /tmp/*.log, then /tmp/one.log, /tmp/child/two.log, and /tmp/child/child/three.log all match. Default: false.
    • initialPosition (optional): Position from which monitoring starts. END_OF_FILE: existing files that match filePattern are not parsed; only newly added files or content is parsed by delimiter and uploaded to DIS. START_OF_FILE: all files that match filePattern are parsed by delimiter and uploaded to DIS in order of modification time, from earliest to latest. Default: START_OF_FILE.
    • maxBufferAgeMillis (optional): Maximum time, in milliseconds, for which the Agent buffers data before sending it to DIS. If the record queue is full of data waiting to be uploaded, data is uploaded to DIS immediately; otherwise, it is uploaded once this period elapses. Default: 5000.
    • maxBufferSizeRecords (optional): Maximum number of records the Agent buffers before sending. When the queue reaches this size, data is uploaded to DIS immediately. Default: 500.
    • partitionKeyOption (optional): Method for generating the partition key. Each record carries a partition key, and records with the same key are allocated to the same partition. RANDOM_INT: a random numeric string; records are distributed evenly across partitions. FILE_NAME: the file name; records from one file go to a specific partition. FILE_NAME,RANDOM_INT: the file name and a random numeric string separated by a comma; records carry the file name and are distributed evenly across all partitions. Default: RANDOM_INT.
    • recordDelimiter (optional): Delimiter used to separate records: any character enclosed in double quotation marks. The value cannot be empty, that is, it cannot be set to "". If the value is a special character, escape it with a backslash: \" for a quotation mark and \\ for a backslash. For a control character such as STX, use a Unicode escape, for example \u0002. Default: "\n".
    • isRemainRecordDelimiter (optional): Whether the delimiter is kept in the uploaded records. true: the delimiter is kept; false: it is stripped. Default: false.
    • isFileAppendable (optional): Whether content may still be appended to the file. true: the Agent monitors the file continuously, parses newly appended content by recordDelimiter, and uploads it; ensure the file ends with recordDelimiter, or the Agent assumes the last record is incomplete and waits for the delimiter to be written. false: the file will not grow; if the last line does not end with recordDelimiter, the Agent still uploads it as the final record and then deletes or renames the file according to deletePolicy and fileSuffix. Default: true.
    • maxFileCheckingMillis (optional): Maximum time, in milliseconds, for checking file changes. If the file size, modification time, and file ID do not change within this period, the file is considered complete and is uploaded. Set this according to how often your files actually change, to avoid uploading incomplete files. If a file changes after being uploaded, it is fully uploaded again. Available only when isFileAppendable is false. Default: 5000.
    • deletePolicy (optional): Policy for handling a file after its content is uploaded. never: the file is not deleted. immediate: the file is deleted. Available only when isFileAppendable is false. Default: never.
    • fileSuffix (optional): Suffix appended to the file name after its content is uploaded. For example, if the original file name is x.txt and fileSuffix is .COMPLETED, the uploaded file is renamed x.txt.COMPLETED. Available only when isFileAppendable is false and deletePolicy is never. Default: .COMPLETED.
    • sendingThreadSize (optional): Number of threads used to send data. A single thread is used by default. NOTICE: with multiple threads, data may not be sent in order, and some data may be lost if the program stops abnormally and restarts. Default: 1.
    • fileEncoding (optional): File encoding. Possible values: UTF8, GBK, GB2312, and ISO-8859-1. Default: UTF8.
    • resultLogLevel (optional): Level at which each call result of the DIS data sending API is logged. OFF: results are not logged. INFO, WARN, or ERROR: each result is logged at that level. Default: INFO.
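
    The settings in Tables 2 and 3 map directly onto keys in agent.yml. The following is a minimal sketch of such a file for this scenario; the region, endpoint, stream name, and log path are illustrative assumptions and must be replaced with your own values:

    # Minimal agent.yml sketch (illustrative values only)
    region: cn-north-1                       # assumed region
    ak: YOUR_ACCESS_KEY_ID                   # from Obtaining Authentication Information
    sk: YOUR_SECRET_ACCESS_KEY
    projectId: YOUR_PROJECT_ID
    endpoint: https://dis.cn-north-1.myhuaweicloud.com   # assumed DIS endpoint for the region
    body.serialize.type: json
    flows:
      # One DISStream entry per monitored directory.
      - DISStream: dis-driving-demo          # hypothetical stream name from Creating a DIS Stream
        filePattern: D:\logs\*.log           # Windows example path from Table 3
        initialPosition: START_OF_FILE       # parse existing file content first
        maxBufferAgeMillis: 5000
        recordDelimiter: "\n"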

  3. (Optional) If you use Windows Notepad to edit the agent.yml file, save it in UTF-8 encoding:

    1. Choose File > Save As.
    2. In the Save As dialog box, set Encoding to UTF-8.
    3. Click Save. A confirmation dialog box is displayed.
    4. Click Yes.

Starting the DIS Agent

  1. Use the file manager to access the bin directory of the DIS agent program, for example, C:\dis-agent-X.X.X\bin.
  2. Double-click the start-dis-agent.bat file. If the following information is displayed in the console window, the DIS agent is successfully started:

    [INFO ] (main) com.bigdata.dis.agent.Agent Agent: Startup completed in XXX ms.

    After the DIS agent is started, files are uploaded immediately and logs are continuously printed. If no ERROR log is found, files are uploaded without error.

    If no new logs are printed for 30 seconds and information similar to the following is displayed, the upload is complete.
    Agent: Progress: [0 records (0 bytes) / 10 files (32573229 bytes)] parsed, and [0  records / 10 files] sent successfully to destinations. Uptime: 30146ms

Viewing Uploaded Files on OBS

  1. Log in to the OBS console.
  2. In the navigation tree, choose Bucket List.
  3. In the Bucket Name column, click the name of the bucket created in Creating an OBS Bucket.
  4. On the displayed page, click the Objects tab to view the uploaded files.

Creating a Database

  1. On the homepage of the management console, choose Service List > Analytics > Data Lake Insight.
  2. Create a demo database. On the DLI console, click Create Job in the SQL Job area. The SQL job editor is displayed.
  3. In the navigation pane on the left of the SQL job editor, use the database creation option to create the demo database.

    NOTE:

    The default database is a built-in database. You cannot create a database named default.
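
    If you prefer SQL, a statement like the following, run in the SQL editor, creates the same database (a sketch; IF NOT EXISTS makes it safe to re-run):

    -- Sketch: create the demo database from the SQL editor instead of the console
    CREATE DATABASE IF NOT EXISTS demo;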

Creating an OBS Table

  1. Choose the demo database and enter the following SQL statement in the editing area:

    create table demo.cars(
      NeutralSlideTime int,
      IsRapidlySlowdown int,
      DataTime STRING,
      Latitude STRING,
      IsOverspeedFinished int,
      IsACCOpen STRING,
      Direction STRING,
      IsOverspeed int,
      IsNeutralSlide int,
      IsOilLeak int,
      BaiDuLatitude STRING,
      OverspeedTime int,
      IsRapidlySpeedup int,
      DeviceID STRING,
      Mileage STRING,
      Longitude STRING,
      Velocity double,
      IsNeutralSlideFinished int,
      IsFatgueDriving int,
      Carnum STRING,
      BaiDuLongitude STRING,
      BaiDuAdress STRING,
      IsHthrottleStop int,
      ReceiveTime STRING,
      Altitude STRING
    ) USING csv OPTIONS (path "obs://......")
    NOTE:

    Change csv in the SQL statement to the format of the file to be dumped to the OBS bucket, and change the OBS path to the actual path for storing data.

  2. Click Execute to create a table.

    Figure 3 Creating a table

    The fields in the table are described as follows (data types as declared in the CREATE TABLE statement):

    • DeviceID (string): Device ID.
    • DataTime (string): Data time.
    • ReceiveTime (string): Time of receipt.
    • IsACCOpen (string): Whether ACC is enabled.
    • Longitude (string): Longitude.
    • Latitude (string): Latitude.
    • Velocity (double): Velocity.
    • Direction (string): Direction.
    • Altitude (string): Altitude.
    • Mileage (string): Mileage.
    • BaiDuLongitude (string): Longitude on the Baidu map.
    • BaiDuLatitude (string): Latitude on the Baidu map.
    • BaiDuAdress (string): Address on the Baidu map.
    • Carnum (string): License plate number.
    • IsRapidlySpeedup (int): Rapid acceleration.
    • IsRapidlySlowdown (int): Rapid deceleration.
    • IsNeutralSlide (int): Coasting in neutral.
    • IsNeutralSlideFinished (int): Coasting in neutral finished.
    • NeutralSlideTime (int): Duration of coasting in neutral, in seconds.
    • IsOverspeed (int): Speeding.
    • IsOverspeedFinished (int): Speeding finished.
    • OverspeedTime (int): Duration of speeding, in seconds.
    • IsFatgueDriving (int): Fatigue driving.
    • IsHthrottleStop (int): Stepping on the accelerator while the vehicle is stationary.
    • IsOilLeak (int): Oil leakage.
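
    After the table is created, you can verify that the dumped files parse into it as expected with a quick preview query (a sketch; adjust the column list as needed):

    -- Sketch: preview a few rows to check that dump files parse correctly
    SELECT DeviceID, Carnum, DataTime, Velocity FROM demo.cars LIMIT 10;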

Querying a Data Sample

  • Acceleration statistics
    select
      Carnum,
      day,
      IFNULL(sum(isRapidlySpeedup), 0) as rapidlySpeedupTimes
    from
      (
        select
          *,
          cast(DataTime as date) as day
        from
          demo.cars
      ) t1
    group by
      Carnum,
      day
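
  • Speeding statistics
    The figures in Querying Results are labeled speeding statistics. A query of the same shape over the IsOverspeed flag (a sketch following the pattern above, not taken verbatim from the source) produces per-vehicle daily speeding counts:
    select
      Carnum,
      day,
      IFNULL(sum(IsOverspeed), 0) as overspeedTimes
    from
      (
        select
          *,
          cast(DataTime as date) as day
        from
          demo.cars
      ) t1
    group by
      Carnum,
      day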

Querying Results

Figure 4 shows the execution results of the query statements.
Figure 4 Speeding statistics

Click Graphical Result to display the query result in a graph. Set Chart Type, X Axis, and Y Axis. Then the speeding statistics diagram is displayed, as shown in Figure 5.

NOTE:
  • If the execution result contains no numeric column, it cannot be displayed as a chart.
  • The available chart types are the bar chart, line chart, and pie chart.
  • In the bar chart and line chart, the X axis can be any column, while the Y axis supports only numeric columns. The pie chart displays the corresponding legends and values.
Figure 5 Speeding statistics
