
Overview

The aie-exec utility, located in the bin sub-directory of the Attivio Platform installation, provides command-line access to a suite of executable tools that can be used to manage or interact with an Attivio Platform system. It can also be used to create new executable tools that take advantage of Attivio libraries.

Basic Usage

To see a list of the available tools, simply run the aie-exec command with no arguments:

Windows:

<install-dir>\bin\aie-exec.exe

Unix:

<install-dir>/bin/aie-exec

Specify the -h argument after a tool's name to view the options available for that specific tool (see below). You may also refer to the sections below for detailed information on the available tools.

Windows:

aie-exec.exe <tool> -h

Unix:

aie-exec <tool> -h

For example, to see the options on the password (password-generator) tool, run the tool with the -h argument:

Windows:

aie-exec.exe password -h

Unix:

aie-exec password -h

When executing a given command, you can set any Java argument by prefixing it with -J. For instance:

aie-exec encrypt -J-Xmx4g
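Multiple -J options can be supplied, one per JVM argument. The sketch below only assembles and prints the command line so it runs anywhere; the tool name and the -Dfile.encoding property are illustrative examples, not values taken from this page:

```shell
# Each JVM option carries its own -J prefix; aie-exec strips the prefix
# and forwards the remainder to the JVM. We only build and print the
# command line here to show its shape.
tool="encrypt"
jvm_opts=(-Xmx4g -Dfile.encoding=UTF-8)   # example JVM arguments
cmd="aie-exec $tool"
for opt in "${jvm_opts[@]}"; do
  cmd="$cmd -J$opt"
done
echo "$cmd"   # prints: aie-exec encrypt -J-Xmx4g -J-Dfile.encoding=UTF-8
```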

Available Tools

allowsnapshots

Purpose: Allows the store and index directories in HDFS to be snapshotted, as required by Backup and Restore for Clustered System. This command must be run as an admin user.

Options:

Option

Description

Default Value

-p, --propertyFile

Property file (Required). Contains the Hadoop configuration for connecting to Attivio. See Optional Properties for more information on the required properties in this file.


-e, --projectEnvironment

The project environment being used. Typical values 'dev', 'qa', 'prod'

'default'

-z, --zooKeeper <arg>

Get the list from one of these configuration servers, in the form:  host1:port1,host2:port2,host3:port3, and so on.

 

-n,--projectName <arg>

Project name stored in the configuration server. Required.

 

Example:

./bin/aie-exec allowsnapshots -z example.lab.attivio.com:2181 -p ../snapshotTest/conf/properties/core-app/attivio.core-app.properties -n snapshotTest
log4j:WARN Failed to set property [printStackTrace] to value "". 
log4j:WARN Failed to set property [printStackTrace] to value "". 
2017-01-04 13:21:21,432 INFO  ServiceFactory - Resetting service connections, listeners, and client parameters
2017-01-04 13:21:21,476 INFO  AieZooKeeper - Connecting to zookeeper on example.lab.attivio.com:2181 with connect timeout of 600000 milliseconds
2017-01-04 13:21:21,600 INFO  AieZooKeeper - Connected zookeeper session 15964919a760d97 (previous=null)
2017-01-04 13:21:21,600 DEBUG AieZooKeeper - Zookeeper session timeout 60000
2017-01-04 13:21:21,615 INFO  CuratorHelper - Loading service definitions from jar:file:/home/jcool/attivio/aie55/aie/lib/aie-core-api.jar!/com/attivio/service/aie-services.properties
2017-01-04 13:21:21,618 DEBUG CuratorHelper - ZooKeeper connection info obtained from system property AIE_ZOOKEEPER: example.lab.attivio.com:2181
2017-01-04 13:21:21,696 INFO  CuratorFrameworkImpl - Starting
2017-01-04 13:21:21,713 INFO  ConnectionStateManager - State change: CONNECTED
2017-01-04 13:21:21,714 INFO  CuratorHelper - Connection state change: CONNECTED
2017-01-04 13:21:21,797 INFO  ZooProperties - Loading properties from project path /configuration/configurationFiles/conf/environments/default/environment.properties
2017-01-04 13:21:21,799 INFO  ZooProperties - Loading properties from project path /configuration/configurationFiles/conf/properties/factbook/factbook.properties
2017-01-04 13:21:21,802 WARN  PsbProperties - ATTIVIO-CONFIGURATION-10 : Undefined property 'attivio.home' found while processing attivio.factbook.content.dir=${attivio.home}/conf/factbook/content 
2017-01-04 13:21:21,802 INFO  ZooProperties - Loading properties from project path /configuration/configurationFiles/conf/properties/sail/sail.properties
2017-01-04 13:21:21,804 INFO  ZooProperties - Loading properties from project path /configuration/configurationFiles/conf/properties/core-app/attivio.core-app.properties
2017-01-04 13:21:21,808 INFO  ZooProperties - Loading properties from project path /configuration/configurationFiles/conf/properties/advancedtextextraction/advancedtextextraction.properties
2017-01-04 13:21:21,810 INFO  ZooProperties - Loading properties from project path /configuration/configurationFiles/conf/properties/memory/memory.properties
2017-01-04 13:21:21,827 DEBUG HadoopUtils - Logging in kerberos user: jcool@EXAMPLE.COM, keytab=/opt/attivio/jcool.keytab
2017-01-04 13:21:22,192 INFO  ZooProperties - Loaded 118 hadoop properties
2017-01-04 13:21:22,192 DEBUG CuratorHelper - ZooKeeper connection info obtained from system property AIE_ZOOKEEPER: example.lab.attivio.com:2181
2017-01-04 13:21:22,192 DEBUG HadoopUtils - Authentication key HADOOP_SECURITY_AUTHENTICATION is set to kerberos
2017-01-04 13:21:22,193 DEBUG HadoopUtils - Authentication key HADOOP_SECURITY_AUTHENTICATION is set to kerberos
2017-01-04 13:21:22,193 DEBUG HadoopUtils - Authentication key HBASE_SECURITY_AUTHENTICATION is set to kerberos
2017-01-04 13:21:23,147 WARN  NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2017-01-04 13:21:23,823 DEBUG HadoopUtils - Logging in kerberos user: jcool@EXAMPLE.COM, keytab=/opt/attivio/jcool.keytab
2017-01-04 13:21:23,829 INFO  CuratorHelper - Stopping curator...

analyzegc

Purpose: Calculates garbage-collection statistics from GC log files.

Options:

Option

Description

Default Value

--csv <arg>

Generates output in CSV format

false

-h, --help

Prints help

 

--in <arg>

Input file name [REQUIRED]

 

--startTime <arg>

The time the gc log started, format: yyyy-MM-dd'T'HH:mm:ss

 

--summarize <arg>

Whether summary results or detailed csv is desired

true

Example:

C:\attivio\bin>aie-exec.exe analyzegc -in C:\attivio-projects\factbook\logs\memory\gc-local-2012-08-22-095148.log
Minors: 69, Time: total(5.34), avg(0.08), max(0.61), %total(1.27)
Majors: 0, Time: total(0.00), avg(NaN), max(0.00), %total(0.00)
Total elapsed time: 419.37, %time gc: 1.27
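The summary figures are internally consistent: avg is total time divided by the number of collections (5.34 / 69 ≈ 0.08). A quick sanity check of a summary line using awk (a sketch against the example output above):

```shell
# Recompute the average minor-GC pause from an analyzegc summary line.
# After splitting on "(", ")", ",", ":" and spaces:
#   $2 = collection count, $5 = total pause time.
summary="Minors: 69, Time: total(5.34), avg(0.08), max(0.61), %total(1.27)"
avg=$(echo "$summary" | awk -F'[(),: ]+' '{ printf "%.2f", $5 / $2 }')
echo "avg=$avg"   # prints: avg=0.08
```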

analyzestack

Purpose: Filters stack-trace dumps of an AIE system to make it easier to find the interesting traces.

Options:

Option

Description

Default Value

--componentNameMapFile <arg>

If present, the file (expected to be in Java properties format) is parsed and used as a component name to component name map.

 

--filterAvm <arg>

If true, filters out all avm message receiver threads

true

--filterBlockedOnSend <arg>

If true, filters out all threads blocked trying to send a message

true

--filterIdleHttp <arg>

If true, filters out all idle HTTP receiver threads

true

--filterMisc <arg>

If true, filters out a bunch of miscellaneous threads and JVM system threads

true

--filterParked <arg>

If true, filters out all parked threads (those waiting for messages to process)

true

--filterTimer <arg>

If true, filters out all timer threads

true

--filterWaitingComponentManager <arg>

If true, filters out all threads waiting for the component manager to grab a message

true

--filterWaitingForComponent <arg>

If true, filters out threads waiting for a component instance to be available

false

-h, --help

Prints help

 

--in <arg>

Input file name [REQUIRED]

 

--out <arg>

The output file for results

 

--printAnalysis <arg>

If true, prints an analysis of the current state

true

--printComponentSummary <arg>

If true, prints a summary of all active components

true

--printStateSummary <arg>

If true, prints a summary of threads by state and type

true

--threadNameFilter <arg>

If set, other filter flags are ignored and only threads matching this regexp are kept

 

Example:

aie-exec analyzestack -in C:\attivio-projects\factbook\logs-local\attivio.error.log -out C:\attivio-projects\factbook\logs-local\analyzestack.txt

backup

Purpose: Starts an index backup in a single-node project.

Options:

Option

Description

Default Value

--commit <arg>

Whether a commit should be performed prior to backup (true or false).

true

--credentials <arg>

HTTP authentication credentials needed if the server requires authentication (example: username/password).

 

-e, --projectEnvironment <arg>

The project environment being used: dev, qa, and so on.

default

-h, --help

Prints help.

 

-n, --projectName <arg>

Project name stored in ZooKeeper.

 

-W, --workflow <arg>

Workflow to send backup message through. [REQUIRED]

 

--wait-for-completion <arg>

Wait for message to be completely processed before returning (true or false).

true

--wait-for-completion-timeout <arg>

Number of milliseconds to wait for completion.

-1

-z, --zooKeeper <arg>

ZooKeeper connection information (a comma-separated list of colon-separated host:port references: host1:port1,host2:port2,host3:port3, and so on).

 

check

Purpose: Checks and recovers indexes.

Options:

Option

Description

Default Value

-h, --help

Prints help

 

--data-directory <arg>

Data directory for AttivioEngine [REQUIRED]

 

--fix

Fixes all broken indexes

 

--fix-index

Fixes the primary index (if broken)

 

--fix-realtime

Fixes the real-time update index (if broken)

 

--index-directory <arg>

Data directory for primary index

 

--skip-index

Skips the primary index when checking

 

--skip-realtime

Skips real time fields when checking

 

-v, --verbose

Displays verbose information about engine

 

Example:

C:\attivio\bin>aie-exec.exe check --data-directory C:\attivio-projects\factbook\data-local\index\index
2012-08-22 16:36:24,483 INFO  CheckEngine - Checking Primary Index at: C:\attivio-projects\factbook\data-local\index\index\index
2012-08-22 16:36:27,073 INFO  CheckEngine - Primary Index (commit=segments_10, segments=1, version=FORMAT_3_1 [Lucene 3.1+]) is OK
2012-08-22 16:36:27,107 INFO  CheckEngine -   Segment Info: _y, compound=false, files=8, documents=3740, deleted=0, size=5.437MB
2012-08-22 16:36:27,107 INFO  CheckEngine - Checking Real Time Field Index at: C:\attivio-projects\factbook\data-local\index\index\rtf
2012-08-22 16:36:27,169 INFO  CheckEngine - Real Time Field Index (commit=segments_8, segments=1, version=FORMAT_3_1 [Lucene 3.1+]) is OK
2012-08-22 16:36:27,169 INFO  CheckEngine -   Segment Info: _6, compound=false, files=10, documents=5, deleted=0, size=0.001MB

This tool provides the ability to check that the indexes for an AttivioEngine have not been corrupted. An exit code of 0 will be returned if all indexes are valid. Log messages will indicate which indexes are corrupted. To get a full dump of all index status, use --verbose.

The --fix options can be destructive. If you use one of them, back up your data directory before running this tool.
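Because check reports validity through its exit code, the backup-then-fix precaution can be scripted. A minimal sketch of the pattern; the real check command is stubbed out with `false` here so the snippet runs without an Attivio installation:

```shell
# "false" stands in for: aie-exec check --data-directory "$DATA_DIR"
# check exits 0 only when all indexes are valid.
check_cmd="false"
if ! $check_cmd; then
  action="backup-then-fix"   # copy the data directory aside, then rerun with --fix
else
  action="none"
fi
echo "$action"
```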

checkHadoop

Purpose: Checks that basic Hadoop services are available. The check can be run for the Attivio platform or for a specific Attivio project.

Note: This tool runs the checkHDFS and checkHBase tools to verify that Attivio has appropriate access to these services. The tools can also be run individually. 

Options:

Option

Description

-h, --help

Prints help

-e, --projectEnvironment <arg>

The project environment being used. Optional. Defaults to 'default'. Typical other values are: 'dev', 'qa', 'uat', 'prod'. If the project environment is provided, it will be used to determine the HBase store namespace to use for testing.

-n,--projectName <arg>

Project name stored in the configuration server. Optional. If the project is provided, it will be used to determine the hbase store namespace to use for testing.

-p, --propertyFile

Property file (Required). Contains the Hadoop configuration for connecting to Attivio, including Kerberos and namespace information. The <project-directory>/conf/properties/core-app/attivio.core-app.properties file can be used if a project already exists. See Project Properties for more information.

-z, --zooKeeper <arg>

Get the list from one of these configuration servers, in the form:  host1:port1,host2:port2,host3:port3, and so on.

Example:

/opt/attivio/bin/aie-exec checkHadoop -z vmnode01:2181,vmnode02:2181,vmnode03:2181 -p ./hadoopconfig.properties
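All of the tools that take -z accept the same comma-separated host:port list. One way to assemble it from a host list (hostnames taken from the example above; a sketch only):

```shell
# Build the -z connection string: host:port pairs joined by commas.
hosts=(vmnode01 vmnode02 vmnode03)
zk=$(printf '%s:2181,' "${hosts[@]}")
zk=${zk%,}    # drop the trailing comma
echo "$zk"    # prints: vmnode01:2181,vmnode02:2181,vmnode03:2181
```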


checkHDFS

Purpose: Checks basic HDFS operations such as write, append, read, delete, and replicate.

Options: See checkHadoop for options and properties.


Example:

/opt/attivio/bin/aie-exec checkHDFS -z vmnode01:2181,vmnode02:2181,vmnode03:2181 -p ./hadoopconfig.properties

checkHBase

Purpose: Checks basic HBase operations such as write, append, read, delete, and replicate.

Options: See checkHadoop for options and properties.

Example:

/opt/attivio/bin/aie-exec checkHBase -z vmnode01:2181,vmnode02:2181,vmnode03:2181 -p ./hadoopconfig.properties

commit

Purpose: Initiate a commit of the index.

Note: A local AIE instance must be running.

Options:

Option

Description

Default Value

--credentials <arg>

HTTP authentication credentials needed if the server requires authentication.

--force <arg>

Whether commit should be forced

false

-h, --help

Prints help

 

-i, --ingest-uri <arg>

URI for the IngestReceiver [REQUIRED]. The IngestReceiver is the IngestService's HTTP address, which defaults to http://<hostname>:<baseport+1>/doc. By default there is an IngestService on every node. Example: http://localhost:17001/doc.

 

-r, --message-result-uri <arg>

URI for MessageResultReceiver. Required if wait-for-completion is false. The default MessageResultReceiver URI is http://<hostname>:<baseport+1>/docResult. Example: http://localhost:17001/docResult.

 

-W, --workflow <arg>

Workflow to send message through [REQUIRED].

 

--wait-for-completion <arg>

Wait for message to be completely processed before returning

true

--wait-for-completion-timeout <arg>

Number of milliseconds to wait for completion

-1 (no timeout)

Example:

aie-exec.exe commit -W indexer -i http://localhost:17001/doc -r http://localhost:17001/docResult
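The two receiver URIs in the example follow directly from the node's base port (17000 in a default installation), as described in the option table above. Deriving them in a script, assuming the default layout:

```shell
# The IngestService listens on baseport+1; /doc receives documents and
# /docResult returns message results.
BASEPORT=17000
HOST=localhost
INGEST_URI="http://$HOST:$((BASEPORT + 1))/doc"
RESULT_URI="http://$HOST:$((BASEPORT + 1))/docResult"
echo "$INGEST_URI $RESULT_URI"
```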

compilelemmas

Purpose: Converts a CSV list of lemmas into FST format. This is an internal tool and is not intended for public use.

Option

Description

-i

CSV input file (required)

-o

FST output file (required)

createarchetypes

Purpose: Creates an archetype to kick-start the client development process.

Options:

NOTE: None of these options is required; those without a short option are considered advanced.

Option

Description

Default Value

Allowed Values

-t

Type of archetype to create.

module

module, client

-o

Output directory where the archetype project will go.

<install_directory>/sdk

any valid directory

--installVersion

Version of Attivio (or of the specific archetype) to use.

<install_version>

any valid Attivio version (5.6, 5.5, 5.2.6, 4.4) or Attivio archetype version

--attivioHome

Installation directory to use for dependencies for a client archetype. This should only be set if the archetype version differs from the Attivio version.

<install_directory>

any valid Attivio installation directory of the same version as 'installVersion'

--mavenExec

The location of the Maven executable (or the name of the executable, assuming it is on the path).

mvn

any valid Maven executable

--mavenArgs

Any additional Maven arguments to be added to the archetype generation command; separate these arguments with a space.

 

any Maven arguments

encrypt

Purpose: Encrypts a string value. Used for creating secure passwords.

Options:

Option

Description

Default Value

-h, --help

Prints help

 

--password <arg>

Password to encrypt

 

Example: See the full example at Securing the Receiver Port.

C:\attivio\bin>aie-exec.exe encrypt --password mypassword
xz304C2tOK9lWg/EXVYoDw==

exportstoresnapshot

(included in latest 5.2.6 patch and runnable as com.attivio.commandline.StoreSnapshotExporter)

Purpose: Exports a given local store snapshot to a given path. This command must be run as an HBase/admin user.

Options:

Option

Description

Default Value

-d, --destination

Destination path (Required). The HDFS location that you would like to export the snapshot to.

-s, --snapshot

Snapshot name (Required). The name of the store snapshot to export.

-p, --propertyFile

Property file (Required). Contains the Hadoop configuration for connecting to Attivio. See Optional Properties for more information on the required properties in this file.


-e, --projectEnvironment

The project environment being used. Typical values 'dev', 'qa', 'prod'

'default'

-z, --zooKeeper <arg>

Get the list from one of these configuration servers, in the form:  host1:port1,host2:port2,host3:port3, and so on. (Required)

 

-n,--projectName <arg>

Project name stored in the configuration server. (Required)

 

Example:

./aie_5.2.6/bin/aie-exec com.attivio.commandline.StoreSnapshotExporter -n project -z zookeeper.host:2181 -d hdfs://export.host:8020/storeExports -p ./path/to/properties.properties -s store_snapshot_name
log4j:WARN Failed to set property [printStackTrace] to value "". 
log4j:WARN Failed to set property [printStackTrace] to value "". 
2017-03-15 16:01:31,535 INFO ServiceFactory - Resetting service connections, listeners, and client parameters
2017-03-15 16:01:31,549 INFO AieZooKeeper - Connecting to zookeeper on zookeeper.host:2181 with connect timeout of 600000 milliseconds
2017-03-15 16:01:31,654 INFO AieZooKeeper - Connected zookeeper session 15aa582978364ff (previous=null)
2017-03-15 16:01:31,654 DEBUG AieZooKeeper - Zookeeper session timeout 60000
2017-03-15 16:01:31,664 INFO CuratorHelper - Loading service definitions from jar:file:/opt/attivio/aie_5.2.6/lib-override/aie-patch-5.2.6.62.jar!/com/attivio/service/aie-services.properties
2017-03-15 16:01:31,666 DEBUG CuratorHelper - ZooKeeper connection info obtained from system property AIE_ZOOKEEPER: zookeeper.host:2181
2017-03-15 16:01:31,754 INFO CuratorFrameworkImpl - Starting
2017-03-15 16:01:31,791 INFO ConnectionStateManager - State change: CONNECTED
2017-03-15 16:01:31,793 INFO CuratorHelper - Connection state change: CONNECTED
2017-03-15 16:01:32,234 INFO ZooProperties - Loading properties from project path /configuration/configurationFiles/conf/environments/default/environment.properties
2017-03-15 16:01:32,236 INFO ZooProperties - Loading properties from project path /configuration/configurationFiles/conf/properties/memory/memory.properties
2017-03-15 16:01:32,239 INFO ZooProperties - Loading properties from project path /configuration/configurationFiles/conf/properties/core-app/attivio.core-app.properties
2017-03-15 16:01:32,242 INFO ZooProperties - Loading properties from project path /configuration/configurationFiles/conf/properties/factbook/factbook.properties
2017-03-15 16:01:32,245 WARN PsbProperties - ATTIVIO-CONFIGURATION-10 : Undefined property 'attivio.home' found while processing attivio.factbook.content.dir=${attivio.home}/conf/factbook/content 
2017-03-15 16:01:32,245 INFO ZooProperties - Loading properties from project path /configuration/configurationFiles/conf/properties/advancedtextextraction/advancedtextextraction.properties
2017-03-15 16:01:32,247 INFO ZooProperties - Loading properties from project path /configuration/configurationFiles/conf/properties/sail/sail.properties
2017-03-15 16:01:32,583 INFO ZooProperties - Loaded 164 hadoop properties
2017-03-15 16:01:32,583 DEBUG CuratorHelper - ZooKeeper connection info obtained from system property AIE_ZOOKEEPER: zookeeper.host:2181
2017-03-15 16:01:32,587 DEBUG HadoopUtils - Authentication key HADOOP_SECURITY_AUTHENTICATION is set to kerberos
2017-03-15 16:01:32,587 DEBUG HadoopUtils - Authentication key HADOOP_SECURITY_AUTHENTICATION is set to kerberos
2017-03-15 16:01:32,587 DEBUG HadoopUtils - Authentication key HBASE_SECURITY_AUTHENTICATION is set to kerberos
2017-03-15 16:01:42,036 INFO HadoopUtils - Using HBase namespace attivio_project
2017-03-15 16:01:42,808 WARN NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2017-03-15 16:01:55,450 WARN Client - Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
(the WARN line above repeats several times)
2017-03-15 16:01:56,764 DEBUG HadoopUtils - Logging in kerberos user: attivio_user, keytab=/opt/attivio/attivio.keytab
(the WARN line repeats several more times)
2017-03-15 16:02:07,804 INFO CuratorHelper - Stopping curator...

getStacks

Purpose: Dumps Java stack traces over JMX, similar to the JDK's jstack utility.

Options:

Option

Description

Default Value

--host <arg>

The AIE JMX host name

localhost

--password <arg>

The password for authenticated endpoints.

 

--port <arg>

The AIE JMX port

17004

--username <arg>

The username for authenticated endpoints

 

Linux Example:

bin/aie-exec getStacks --host localhost --port 17004

importstoresnapshot

(included in latest 5.2.6 patch and runnable as com.attivio.commandline.StoreSnapshotImporter)

Purpose: Restores the store from a snapshot located at a given path. This command must be run as an HBase/admin user.

Options:

Option

Description

Default Value

-s, --source

Source path (Required). The source path to import the snapshot from (must include the name of the snapshot).

-p, --propertyFile

Property file (Required). Contains the Hadoop configuration for connecting to Attivio. See Optional Properties for more information on the required properties in this file.


-e, --projectEnvironment

The project environment being used. Typical values 'dev', 'qa', 'prod'

'default'

-z, --zooKeeper <arg>

Get the list from one of these configuration servers, in the form:  host1:port1,host2:port2,host3:port3, and so on. (Required)

 

-n,--projectName <arg>

Project name stored in the configuration server. (Required)

 

jmeterTestGen

The jmeter module was removed in Attivio Platform 5.5.1. See the Load Testing Using JMeter page and the loadgen and querygen tools on this page for more on load testing in this version.


Purpose: Used with the jmeter module to generate jmeter test files based on a particular system configuration.

Options:

Option

Description

Default Value

-f, --queryFile <arg>

Name of the file containing line-separated keywords or phrases to use for executing queries. Multiple comma-separated files accepted.

 

-f,--facet <arg>

Name of a facet to request when executing queries.

 

--facetsPerQuery <arg>

Number of facets to request on each query (randomly selected, 0 for all)

0

-i,--ingestionNodeName <arg>

Name of the node that will be contacted for executing commits and connectors.

 

-l,--logPrefix <arg>

Prefix (including directory location) for log files created by JMeter test execution.

 

-n,--projectName <arg>

Project name stored in the configuration server. Required.

 

-o,--output <arg>

Output file name.

<projectName>.jmx

-q,--queryNodeName <arg>

Name of the node that will be contacted for executing queries.

 

-r,--realm <arg>

Name of the realm for executing secured queries.

1

-s,--seed <arg>

Random generator seed, -1 for time-based seed

1

-u,--users <arg>

Name of the file containing line-separated principal names to use for executing secured queries.

1

-z, --zooKeeper <arg>

Zookeeper connection information, in the form:  host1:port1,host2:port2,host3:port3, and so on.

 

Example: From Load Testing Using JMeter.

C:\attivio\bin>aie-exec.exe jmetertestgen -z vmnode-073:16080 -n myProject

jmx

Purpose: Accesses JMX MBeans, their attributes, and their operations.

Options:

Option

Description

Default Value

--attribute <arg>

The attribute value

 

--fullList

If true, all MBeans, including low-level ones, will be listed when listing beans

false

-h, --help

Prints help

 

--host <arg>

The AIE JMX host name

localhost

--mbean <arg>

The MBean to use. If not provided a list of all MBeans is displayed.

 

--operation <arg>

The operation to perform

 

-p, --params <arg>

The parameters for operation

 

--port <arg>

The AIE JMX port

17004

Linux Example: 

bin/aie-exec jmx --host localhost --port 17004 --mbean AttivioAdmin:class=com.attivio.admin.api.ConfigurationApi,name=bean.configurationApi

A number of JMX domains are registered by Attivio code and by the third-party tools it works with. Only the AttivioAdmin domain and its contained MBeans are supported. Serious stability issues, including system deadlock, can occur if unsupported JMX beans are used.

listlocks

Purpose: Prints information about currently held locks on a specified project. See PLAT-33278.

Options:

Option

Description

-p, --projectName <arg>

The project name (required).

-z, --zooKeeper <arg>

Zookeeper connection information, in the form: host:port,host:port,host:port, and so on.

listprojects

Purpose: Lists all projects currently deployed to the configuration server. See PLAT-33278.

Options:

Option

Description

-z, --zooKeeper <arg>

Get the list from one of these configuration servers, in the form:  host1:port1,host2:port2,host3:port3, and so on.

Example: 

C:\attivio\bin>aie-exec.exe listprojects -z localhost:16080

loadgen

Purpose: Produce artificial email and document ingestion load.

The loadgen command pushes artificial data to an AIE cluster via the Java API at a desired rate.  Details: Loadgen

modulemanager

Purpose: Install, list, or remove Attivio modules.

Notes: Module installation or removal must be performed separately on each host's Attivio installation.

Options:

Option

Description

Default Value

-h, --help

Prints help
-i, --installModule <arg>

Installs module from ZIP archive with path and filename specified as <arg>

Example: aie-exec modulemanager -i /home/users/myuser/sharepoint-1.0.0.zip

Windows Notes:

  1. Installation command must be run from an elevated ("Run as administrator") Command Prompt session.
  2. Module ZIP archive paths/filenames must be specified as file:// URLs, rather than using standard Windows paths.

Example: aie-exec modulemanager -i file:///C:/Users/myuser/Downloads/sharepoint-1.0.0.zip


-l, --listModules

Lists currently-installed modules and their versions

-n, --dryRun

Displays the changes that would result from running a modulemanager command but does not actually make those changes. Useful for previewing the effects of an --installModule or --removeModule operation.

-r, --removeModule <arg>

Removes module with name specified as <arg>

Example: aie-exec modulemanager -r sharepoint

Windows Notes:

  1. Removal command must be run from an elevated ("Run as administrator") Command Prompt session.
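On Windows, the module archive passed to --installModule must be given as a file:// URL rather than a conventional Windows path. One way to derive that URL (a sketch using only the Python standard library, independent of aie-exec):

```python
from pathlib import PureWindowsPath

# Convert a conventional Windows path into the file:// URL form
# that --installModule expects on Windows.
archive = PureWindowsPath(r"C:\Users\myuser\Downloads\sharepoint-1.0.0.zip")
url = archive.as_uri()
print(url)  # file:///C:/Users/myuser/Downloads/sharepoint-1.0.0.zip
```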

optimize

Purpose: Initiate an optimize of the index.

Options:

Option

Description

Default Value

--force <arg>

Whether optimize should be forced

true

-h, --help

Prints help

 

-i, --ingest-uri <arg>

URI for the IngestReceiver [REQUIRED]. The IngestReceiver is the IngestService's HTTP address, which defaults to http://<hostname>:<baseport+1>/doc. By default there is an IngestService on every node. Example: http://localhost:17001/doc.

 

-r, --message-result-uri <arg>

URI for MessageResultReceiver. Required if wait-for-completion is false. The default MessageResultReceiver URI is http://<hostname>:<baseport+1>/docResult.  Example: http://localhost:17001/docResult.

 

-W, --workflow <arg>

Workflow to send message through [REQUIRED]

 

--wait-for-completion <arg>

Wait for message to be completely processed before returning

true

--wait-for-completion-timeout <arg>

Number of milliseconds to wait for completion

-1 (no timeout)

Example:

aie-exec.exe optimize -W indexer -i http://localhost:17001/doc -r http://localhost:17001/docResult
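Both URIs in the example follow the baseport convention described in the option table (the IngestService listens on baseport+1, serving /doc and /docResult). A small illustrative helper, not part of aie-exec, that builds them:

```python
def service_uris(hostname: str, baseport: int):
    """Build the default IngestReceiver and MessageResultReceiver URIs.

    The IngestService listens on baseport+1; /doc receives documents
    and /docResult reports message results.
    """
    port = baseport + 1
    return (f"http://{hostname}:{port}/doc",
            f"http://{hostname}:{port}/docResult")

ingest, result = service_uris("localhost", 17000)
print(ingest)  # http://localhost:17001/doc
print(result)  # http://localhost:17001/docResult
```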

password

Purpose: Generate encoded passwords.

Options:

Option

Description

Default Value

-h, --help

Prints help

 

-p, --password <arg>

The password to encode (will ask from console if not supplied)

 

-u, --username <arg>

The username to encode the password for

 

Example: From Using a Custom Certificate.

C:\attivio\bin>aie-exec.exe password -p <your password>
<your password>
OBF:1xfd1zt11uha1ugg1zsp1xfp
MD5:a029d0df84eb5549c641e04a9ef389e5

processbuilder

The processbuilder is an internal executable. See PLAT-20794.

querygen

Purpose: Submit queries defined in a JSON input file to Attivio for load testing.

Options:

Option

Description

Default Value

-c, --config <arg>

Location and filename of the querygen JSON file

querygen.json

-g, --generateSampleConfig

Outputs a sample querygen JSON file

-h, --help

Prints help



schedule

Purpose: Schedule connectors to run at specific times.

Options:

Option

Description

Default Value

-c, --connector-uri <arg>

The connector URI, local or HTTP [REQUIRED]

 

--cron <arg>

The CRON expression to run connector

 

-h, --help

Prints help

 

-n, --task-name <arg>

The name of scheduled task

 

--overwrite <arg>

Overwrites existing schedule for this task name

true

--owner <arg>

The scheduled task owner

admin

--remove <arg>

Removes existing schedule for this task name

false

-s, --scheduler-uri <arg>

The schedule service URI [REQUIRED]

http://localhost:17001/scheduler

Example: From Scheduling Tasks.

C:\attivio\bin>aie-exec.exe schedule -s http://localhost:17001/scheduler --cron "0 0,30 * * * ?" --connector-uri dbConnector
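The --cron argument in the example is a Quartz-style cron expression, which puts a seconds field first. A sketch (a hypothetical helper, not part of aie-exec) that labels the fields of the expression used above:

```python
QUARTZ_FIELDS = ["second", "minute", "hour",
                 "day-of-month", "month", "day-of-week"]

def label_cron(expr: str) -> dict:
    """Map each field of a Quartz cron expression to its name."""
    return dict(zip(QUARTZ_FIELDS, expr.split()))

fields = label_cron("0 0,30 * * * ?")
print(fields["minute"])  # 0,30 -> the connector runs on the hour and half hour
```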

search

Purpose: Run a search from the command line and get the results in XML.


Note: There must be an instance of AIE running.

Options:

Option

Description

Default Value

-h, --help

Prints help

 

-l, --query-language <arg>

The query language

advanced

--password <arg>

The password for authenticated endpoints

 

-q, --query <arg>

The query string [REQUIRED]

 

-u, --query-uri <arg>

The query service URI

http://localhost:17001/query

--username <arg>

The username for authenticated endpoints

 

-W, --workflow <arg>

The query workflow

search

Example:

C:\attivio\bin>aie-exec.exe search -l simple -q "Bounty"

volumetrics

Purpose: Produce a report of ingestion and query performance along with resource utilization.

The volumetrics command attaches to a running AIE cluster and interrogates performance monitoring data to produce a report.  Details: Volumetrics


uploadSignals

Purpose: Upload external signal data to signal store for input to Machine Learning relevancy tuning. See Signal Tracking API for more specific information regarding signals.


Note: There must be an instance of AIE running.

Options:

Option

Description

Default Value

-h, --help

Prints help

 

-s <arg>

The signal type for all uploaded signals

<required>

-q <arg>

The query language for queries specified in the input file

simple

--replace <arg>

Indicate if previous signals for the specified signal type should be deleted prior to uploading.

false

-r <arg>

Relevancy model name.
The signal data uploaded will only be used to train models with the specified names.
This argument can be specified multiple times to support using this training data with multiple models.

<required>

-P <arg>

REST parameter for QueryRequest.
This argument can be specified multiple times to pass multiple rest parameters to the query request.

This can be used to specify a custom workflow, search profile, etc., to ensure that the query used for generating signals goes through the same processing as end-user queries.
See HTTP REST APIs for documentation on available rest parameters.

 

-i <arg>

The input file.
Input file is a csv file with the following columns:
<query>,<weight>,<documentId>

Higher weights indicate more important documents for a given query.
Multiple documents can have the same weight, indicating they have equivalent/indistinguishable relevancy for the given query.

<required>

--ttl <arg>

Indicates whether TTL will be honored for signals uploaded with this executable. By default, TTL is not honored for uploaded signals, which makes them persistent. Specifying true renders the uploaded signals ephemeral; they are deleted automatically after they expire.

false
-c, --context <arg>

If specified, this setting indicates the number of documents that should be used to generate the context for the uploaded signals. Documents used for context are other documents that match the query but are not specified in the input file. Context documents are given a weight of 0, indicating that all specified documents are "better" than the contextual docs.

0
-f, --filter <arg>

Specify a filter (in advanced query syntax) to apply to the query used for resolving context. If context is 0, the filter is ignored.

<null>

-z, --zooKeeper <arg>

ZooKeeper host and port. Typically -z localhost:16980 for single-node projects, -z <zooKeeper-hostname>:2181 for Hadoop projects.

 

-n, --projectName <arg>

Project name stored in the configuration server. Required.

 
-e, --environment <arg>

The project environment being used (prod, dev, qa, etc.).

default

Example:

C:\attivio\bin>aie-exec.exe uploadSignals -s supervised -r default -r mlmodel --replace true -q simple -i inputfile.csv -P workflow=search
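The -i input file is plain CSV in the <query>,<weight>,<documentId> form described in the option table. A sketch (with hypothetical queries and document IDs) that writes and re-reads such a file using the Python standard library:

```python
import csv
import io

# Each row: query, weight, documentId. Higher weight = more relevant
# for that query; equal weights mean indistinguishable relevancy.
rows = [
    ("mutiny", 2, "DOC-101"),  # hypothetical document IDs
    ("mutiny", 1, "DOC-205"),
    ("bounty", 1, "DOC-300"),
]

buf = io.StringIO()
csv.writer(buf).writerows(rows)
signals_csv = buf.getvalue()
print(signals_csv)

# Re-read to confirm the three-column structure uploadSignals expects.
parsed = list(csv.reader(io.StringIO(signals_csv)))
```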


trainRelevancy

(com.attivio.platform.relevancy.TrainRelevancyTool)

Purpose: Train a relevancy model using external signal data. See Training Relevancy Models for more information.

Note: There must be an instance of AIE running.

Options:

Option

Description

Default Value

-h, --help

Prints help

 

-T, --template <arg>

Specify a JSON template to use for the base relevancy model.

-q <arg>

The query language for queries specified in the input file

simple

--publish <arg>

Indicate if the relevancy model should be published on successful training.

false

-t, --target-accuracy <arg>

The target accuracy for training relevancy models.
Training will continue until this accuracy is achieved, or until the max number of iterations has been exhausted.
If this accuracy cannot be achieved, the training will fail and no model will be produced.

90.0

-m, --max-iterations <arg>

The maximum number of iterations to attempt training the relevancy model.

5

-r <arg>

The name of the relevancy model to train.

<required>

-P <arg>

REST parameter for QueryRequest.
This argument can be specified multiple times to pass multiple rest parameters to the query request.

This can be used to specify a custom workflow, search profile, etc., to ensure that the query used for generating signals goes through the same processing as end-user queries.
See HTTP REST APIs for documentation on available rest parameters.

 

-i <arg>

The input file.
Input file is a csv file with the following columns:
<query>,<weight>,<documentId>

Higher weights indicate more important documents for a given query.
Multiple documents can have the same weight, indicating they have equivalent/indistinguishable relevancy for the given query.

<required>

-c, --context <arg>

Specify the number of documents to use per-query for computing accuracy and to use as feedback for improving quality of relevancy models iteratively.

10

-z, --zooKeeper <arg>

ZooKeeper host and port. Typically -z localhost:16980 for single-node projects, -z <zooKeeper-hostname>:2181 for Hadoop projects.

 

-n, --projectName <arg>

Project name stored in the configuration server. Required.

 
-e, --environment <arg>

The project environment being used (prod, dev, qa, etc.).

default

Example:

C:\attivio\bin>aie-exec.exe com.attivio.platform.relevancy.TrainRelevancyTool -r default --publish true -q simple -i inputfile.csv -P workflow=search
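The interaction between --target-accuracy and --max-iterations described above can be sketched as a simple control loop. This is illustrative only; evaluate_model is a stand-in, not an Attivio API:

```python
def train(evaluate_model, target_accuracy=90.0, max_iterations=5):
    """Iterate until the model reaches the target accuracy or the
    iteration budget is exhausted; return None (no model) on failure."""
    for iteration in range(1, max_iterations + 1):
        accuracy = evaluate_model(iteration)
        if accuracy >= target_accuracy:
            return iteration, accuracy  # model produced
    return None  # target accuracy not achieved; training fails

# Stand-in evaluator whose accuracy improves each iteration.
result = train(lambda i: 80.0 + 4.0 * i)
print(result)  # (3, 92.0)
```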