hawq check

Verifies and validates HAWQ platform settings.

Synopsis

hawq check -f hostfile_hawq_check [--hadoop | --hadoop-home hadoop_home]
        [--stdout | --zipout] [--config config_file]
         [--kerberos] [--hdfs-ha] [--yarn] [--yarn-ha]
hawq check --zipin hawq_check_zipfile

hawq check -d temp_directory
    {-f hostfile_checknet | -h hostname [-h hostname ...]}
    [ -r n|N|M [--duration time] [--netperf] ] [-D] [-v | -V]

hawq check -?

hawq check --version

Description

The hawq check utility determines the platform on which you are running HAWQ and validates various platform-specific configuration settings, as well as HAWQ- and HDFS-specific configuration settings. Before performing HAWQ configuration checks, make sure HAWQ has already been started and that hawq config works. For HDFS checks, either set the HADOOP_HOME environment variable or specify the Hadoop installation location with the --hadoop option.

The hawq check utility can use a host file or a file previously created with the --zipout option to validate platform settings. If GPCHECK_ERROR displays, one or more validation checks failed. You can also use hawq check to gather and view platform settings on hosts without running validation checks. When running checks, hawq check compares your actual configuration settings with the expected values listed in a configuration file ($GPHOME/etc/hawq_check.cnf by default). You must modify the configuration values for "mount.points" and "diskusage.monitor.mounts" to reflect the actual mount points you want to check, as a comma-separated list. Otherwise, the utility only checks the root directory, which may not be helpful.

An example is shown below:

[linux.mount] 
mount.points = /,/data1,/data2 

[linux.diskusage] 
diskusage.monitor.mounts = /,/data1,/data2

Options

--config config_file
The name of a configuration file to use instead of the default file $GPHOME/etc/hawq_check.cnf.

-f hostfile_hawq_check
The name of a file that contains a list of hosts that hawq check uses to validate platform-specific settings. This file should contain a single host name for all hosts in your HAWQ system (master, standby master, and segments).
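For example, a hostfile might look like the following (the host names are hypothetical placeholders for your master, standby master, and segment hosts):

```
mdw
smdw
sdw1
sdw2
sdw3
```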

--hadoop | --hadoop-home hadoop_home
Use this option to specify your Hadoop installation location so that hawq check can validate HDFS settings. This option is not needed if the HADOOP_HOME environment variable is set.
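As a sketch, assuming a hypothetical installation path of /usr/local/hadoop, the following two invocations are equivalent ways to point hawq check at Hadoop:

```
# export HADOOP_HOME=/usr/local/hadoop
# hawq check -f hostfile_hawq_check

# hawq check -f hostfile_hawq_check --hadoop /usr/local/hadoop
```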

--kerberos
Use this option to check HDFS and YARN when running in Kerberos mode. This allows hawq check to validate HAWQ/HDFS/YARN settings with Kerberos enabled.

--hdfs-ha
Use this option to indicate that HDFS-HA mode is enabled, allowing hawq check to validate HDFS settings with HA mode enabled.

--yarn
If HAWQ is using YARN, enables yarn mode, allowing hawq check to validate the basic YARN settings.

--yarn-ha
Use this option to indicate HAWQ is using YARN with High Availability mode enabled, to allow hawq check to validate HAWQ-YARN settings with YARN-HA enabled.

--stdout
Send collected host information from hawq check to standard output. No checks or validations are performed.

--zipout
Save all collected data to a compressed archive in the current working directory. hawq check automatically creates the archive and names it hawq_check_timestamp.tar.gz. No checks or validations are performed.

--zipin file
Use this option to decompress and check an archive created with the --zipout option. If you specify the --zipin option, hawq check performs validation tasks against the specified file.

--version
Displays the version of this utility.

-? (help)
Displays the online help.

Examples

Verify and validate the HAWQ platform settings by supplying a host file and specifying the Hadoop location:

# hawq check -f hostfile_hawq_check --hadoop ~/hadoop-2.0.0/

Verify and validate the HAWQ platform settings with HDFS HA, YARN HA, and Kerberos enabled:

# hawq check -f hostfile_hawq_check --hadoop ~/hadoop-2.0.0/
                       --hdfs-ha --yarn-ha --kerberos

Verify and validate the HAWQ platform settings with HDFS HA and Kerberos enabled:

# hawq check -f hostfile_hawq_check --hadoop ~/hadoop-2.0.0/
                       --hdfs-ha --kerberos

Save HAWQ platform settings to a compressed archive when the HADOOP_HOME environment variable is set:

# hawq check -f hostfile_hawq_check --zipout  

Verify and validate the HAWQ platform settings using an archive created with the --zipout option:

# hawq check --zipin hawq_check_timestamp.tar.gz

View collected HAWQ platform settings:

# hawq check -f hostfile_hawq_check --hadoop ~/hadoop-2.0.0/ --stdout

See Also

hawq checkperf