Configuring Data Studio

Updated on 2025-01-09 GMT+08:00

This section describes how to configure Data Studio. It also explains how to configure the server for debugging PL/SQL functions.

Configuring Data Studio

Configure Data Studio by editing the Data Studio.ini file.

NOTE:

Restart Data Studio for parameter changes to take effect. Invalid parameters added to the configuration file are ignored by Data Studio.

The following table lists the configuration parameters used in Data Studio.

Table 1 Configuration parameters

-startup
  Description: Defines the JAR files required to load Data Studio. This information varies based on the version used.
  Value range: N/A
  Default value: plugins/org.eclipse.equinox.launcher_1.3.100.v20150511-1540.jar

--launcher.library
  Description: Specifies the library required for loading Data Studio. The library varies depending on the Data Studio version.
  Value range: N/A
  Default value: plugins/org.eclipse.equinox.launcher.win32.win32.x86_1.1.300.v20150602-1417 or plugins/org.eclipse.equinox.launcher.win32.win32.x86_64_1.1.300.v20150602-1417, depending on the installation package used

-clearPersistedState
  Description: Removes any cached content on the GUI and reloads Data Studio.
  NOTE: You are advised to add this parameter.
  Value range: N/A
  Default value: N/A

-consoleLineCount
  Description: Defines the maximum number of lines displayed in the Messages window.
  Value range: 1-5000
  Default value: 1000

-logfolder
  Description: Creates the log folder in a user-specified path. If the default value "." is used, the folder is created in Data Studio\UserData\<user name>\logs. For details, see Setting the Location for Creating Log Files.
  Value range: N/A
  Default value: -

-loginTimeout
  Description: Defines the connection wait time in seconds. Within the period specified by this parameter, Data Studio continuously attempts to connect to the database. If the connection is not established within this period, a message is displayed indicating that the connection timed out or failed.
  Value range: N/A
  Default value: 180

-data
  Description: Defines the instance data location for the session. The value can be @none or @user.home/MyAppWorkspace. With @user.home/MyAppWorkspace, the Eclipse workspace is created in this location while Data Studio is being launched; @user.home refers to C:/Users/<username>, and Eclipse log files are available in @user.home/MyAppWorkspace/.metadata.
  Value range: N/A
  Default value: N/A

-detailLogging
  Description: Defines the criteria for logging error messages. Set it to True to log all error messages, or to False to log only the error messages explicitly specified by Data Studio. Refer to Controlling Exception and Error Logs for more information. This parameter is not added by default and can be set manually if logging is required.
  Value range: True/False
  Default value: False

-logginglevel
  Description: Creates the log files based on the specified log level. If the value provided is arbitrary or empty, log files are created using the WARN level. For details, see Different Types of Log Levels. This parameter is not added by default and can be set manually if logging is required.
  Value range: FATAL, ERROR, WARN, INFO, DEBUG, TRACE, ALL, and OFF
  Default value: WARN

-focusOnFirstResult
  Description: Defines the auto focus behavior of the Result window. Set it to false to automatically set focus to the last opened Result window, or to true to disable automatic focus.
  Value range: True/False
  Default value: False

NOTE:
  • All the above parameters must be added before -vmargs. See the sample Data Studio.ini file after this table for an illustration of the required ordering.
  • -startup and --launcher.library must be added as the first and second parameters, respectively.

-vmargs
  Description: Specifies the start of the virtual machine arguments.
  NOTE: -vmargs must be the last parameter in the configuration file.
  Value range: N/A
  Default value: N/A

-vm <file name (javaw.exe) with relative path to Java executable>
  Description: Specifies the file name, for example, javaw.exe, and the relative path to Java.
  Value range: N/A
  Default value: N/A

-Dosgi.requiredJavaVersion
  Description: Defines the minimum Java version required to run Data Studio. This value must not be modified.
  NOTE: The recommended Java version is 1.8.0_141.
  Value range: N/A
  Default value: 1.5

-Xms
  Description: Defines the initial heap space that Data Studio consumes. This value must be a multiple of 1024, greater than 40 MB, and less than or equal to the -Xmx size. Append the letter k or K to indicate kilobytes, m or M to indicate megabytes, or g or G to indicate gigabytes. For example: -Xms40m, -Xms120m. Refer to the Java documentation for more information.
  Value range: N/A
  Default value: -Xms40m

-Xmx
  Description: Defines the maximum heap space that Data Studio consumes. This value can be modified based on the available RAM. Append the letter k or K to indicate kilobytes, m or M to indicate megabytes, or g or G to indicate gigabytes. For example: -Xmx1200m, -Xmx1000m. Refer to the Java documentation for more information.
  Value range: N/A
  Default value: -Xmx1200m

-OLTPVersionOldST
  Description: Used to configure earlier OLTP versions. Log in to gsql and run SELECT VERSION() to obtain the version number, then use it to update the OLTPVersionOldST parameter in the .ini file.
  Value range: -
  Default value: -

-OLTPVersionNewST
  Description: Used to configure the latest OLTP version. Log in to gsql and run SELECT VERSION() to obtain the version number, then use it to update the OLTPVersionNewST parameter in the .ini file.
  Value range: -
  Default value: -

-testability
  Description: Enables testability features. In the current version, after this function is enabled:
    • The user can copy the content of the last triggered auto-suggest operation using the Ctrl+Space shortcut key.
    • When Include Analyze is selected, Execution Plan and Cost is displayed in the tree and graphical views.
  This parameter is not added by default and needs to be added manually for testing.
  Value range: True/False
  Default value: False

-Duser.language
  Description: Defines the language setting of Data Studio. This parameter is added after the language setting is changed.
  Value range: zh/en
  Default value: N/A

-Duser.country
  Description: Specifies the country/region setting of Data Studio. This parameter is added after the language setting is changed.
  Value range: CN/IN
  Default value: N/A

-Dorg.osgi.framework.bundle.parent=ext
  Description: Specifies which class loader is used for boot delegation.
  Value range: boot/app/ext
  Default value: boot

-Dosgi.framework.extensions=org.eclipse.fx.osgi
  Description: Specifies a list of framework extension names. Framework extension bundles are fragments of the system bundle (org.eclipse.osgi). As fragments, they can provide extra classes for the framework to use.
  Value range: N/A
  Default value: N/A
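For reference, the following is a minimal, illustrative Data Studio.ini layout that follows the ordering rules above: -startup first, --launcher.library second, all Data Studio parameters before -vmargs, and only JVM arguments after -vmargs. The launcher JAR and library versions are placeholders that depend on the installed package, and the placement of the -startup and --launcher.library values on the following line is assumed from the standard Eclipse launcher convention.

    -startup
    plugins/org.eclipse.equinox.launcher_1.3.100.v20150511-1540.jar
    --launcher.library
    plugins/org.eclipse.equinox.launcher.win32.win32.x86_64_1.1.300.v20150602-1417
    -clearPersistedState
    -detailLogging=false
    -logfolder=.
    -vmargs
    -Xms40m
    -Xmx1200m
    -Dosgi.requiredJavaVersion=1.5
    -Dorg.osgi.framework.bundle.parent=ext
    -Dosgi.framework.extensions=org.eclipse.fx.osgi

In practice, edit the Data Studio.ini file shipped with your installation rather than creating one from scratch, because the launcher JAR and library versions must match the installed package.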

NOTE:
  • You are not allowed to modify the following settings:

    -Dorg.osgi.framework.bundle.parent=ext

    -Dosgi.framework.extensions=org.eclipse.fx.osgi

  • If you receive the message SocketException: Bad Address: Connect:

    Check whether the client is connected to the server using the IPv6 or IPv4 protocol. You can also establish the connection by configuring the following parameters in the .ini file:

    -Djava.net.preferIPv4Stack=true

    -Djava.net.preferIPv6Stack=false

    Table 2 lists the supported communication scenarios.

    The first row and first column indicate the types of nodes that attempt to communicate with each other. x indicates that the nodes can communicate with each other.

Table 2 Communication scenarios

Node    | V4 only                   | V4/V6 | V6 only
V4 only | x                         | x     | No communication possible
V4/V6   | x                         | x     | x
V6 only | No communication possible | x     | x

Setting the Location for Creating Log Files

  1. Open the Data Studio.ini file.
  2. Provide the path for the -logfolder parameter.

    For example:

    -logfolder=c:\test1

    In this case, the Data Studio.log file is created in the c:\test1\<user name>\logs path.

    NOTE:

    If a user does not have access to the path specified in the Data Studio.ini file, Data Studio closes and displays an error pop-up message.

The Data Studio.log file will be created in the Data Studio\UserData\<user name>\logs path if:

  • The path is not provided in the Data Studio.ini file.

    For example: -logfolder=.

  • The path provided does not exist.
NOTE:

Refer to the server manual for detailed information.

You can use any text editor to open and view the Data Studio.log file.
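
The two cases can be summarized with the following illustrative examples:

    -logfolder=c:\test1 creates the log file c:\test1\<user name>\logs\Data Studio.log
    -logfolder=. creates the log file Data Studio\UserData\<user name>\logs\Data Studio.log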

Controlling Exception and Error Logs

Whether the stack traces of errors, exceptions, and throwables are logged is controlled by a program argument configured in the Data Studio.ini file:

-detailLogging=false

If -detailLogging is set to True, the stack traces of errors, exceptions, and throwables are logged.

If -detailLogging is set to False, these stack traces are not logged.

Description of the Log Message

Log files are managed as follows:

When the Data Studio.log file reaches its maximum size of 10,000 KB, the system automatically creates a new file and saves the existing logs as Data Studio.log.1. When Data Studio.log reaches the maximum size again, its logs are saved as Data Studio.log.2. The latest logs are always written to Data Studio.log. This process continues until Data Studio.log.5 reaches the maximum file size and the cycle restarts: Data Studio deletes the earliest log file, Data Studio.log.1, and renames the remaining files (Data Studio.log.5 becomes Data Studio.log.4, Data Studio.log.4 becomes Data Studio.log.3, and so on).
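
For example, after five rotations the log folder contains the following files (illustrative listing):

    Data Studio.log     (latest logs)
    Data Studio.log.5   (most recent archived logs)
    Data Studio.log.4
    Data Studio.log.3
    Data Studio.log.2
    Data Studio.log.1   (earliest archived logs; removed at the next rotation)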

NOTE:

To enable performance logging in the server log file, set the configuration parameter log_min_messages to debug1 in the data/postgresql.conf configuration file, that is, log_min_messages = debug1.

Different Types of Log Levels

The different types of log levels that are displayed in the Data Studio.log file are as follows:

  • TRACE: The TRACE level provides more detailed information than the DEBUG level.
  • DEBUG: The DEBUG level indicates the granular information events that are most useful for debugging an application.
  • INFO: The INFO level indicates the information messages that highlight the progress of the application.
  • WARN: The WARN level indicates potentially harmful situations.
  • ERROR: The ERROR level indicates error events.
  • FATAL: The FATAL level indicates events that cause the application to abort.
  • ALL: The ALL level turns on all the log levels.
  • OFF: The OFF level turns off all the log levels. This is the opposite of the ALL level.
    NOTE:
    • If the user enters an invalid log level, the log level is set to WARN.
    • If the user does not provide any log level, the log level is set to WARN.

The logger outputs all messages greater than or equal to its log level. For example, if the log level is set to WARN, messages of the WARN, ERROR, and FATAL levels are logged, while INFO, DEBUG, and TRACE messages are not.

The order of the standard log4j levels is as follows:

Table 3 Log levels

Log Level | FATAL | ERROR | WARN | INFO | DEBUG | TRACE
OFF       |   x   |   x   |   x  |   x  |   x   |   x
FATAL     |   √   |   x   |   x  |   x  |   x   |   x
ERROR     |   √   |   √   |   x  |   x  |   x   |   x
WARN      |   √   |   √   |   √  |   x  |   x   |   x
INFO      |   √   |   √   |   √  |   √  |   x   |   x
DEBUG     |   √   |   √   |   √  |   √  |   √   |   x
TRACE     |   √   |   √   |   √  |   √  |   √   |   √
ALL       |   √   |   √   |   √  |   √  |   √   |   √

√ - Creating a log file; x - Not creating a log file
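
For example, to record only ERROR and FATAL messages in Data Studio.log, set the -logginglevel parameter described in Table 1 to ERROR; assuming the same key=value form shown for -detailLogging, the entry would read -logginglevel=ERROR.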
