Command-Line Basic Client
AMPS includes a command-line client, `spark`, which can be used to run queries, place subscriptions, and publish data. While it can be used for each of these purposes, `spark` is provided as a useful tool for informal testing and troubleshooting of AMPS instances. For example, you can use `spark` to test whether an AMPS instance is reachable from a particular system, or to perform ad hoc queries to inspect the data in AMPS.

This chapter describes the commands available in the `spark` utility. For more information on the features available in AMPS, see the relevant chapters in this guide.
The `spark` utility is included in the `bin` directory of the AMPS install location. The `spark` client is written in Java, so running `spark` requires a Java Virtual Machine for Java 8 or later.

To run this client, type `./bin/spark` at the command line from the AMPS installation directory. It outputs its help screen, shown below, with a brief description of the `spark` client features.
```
%> ./bin/spark
===============================
- Spark - AMPS client utility -
===============================
Usage:
spark help [command]
Supported Commands:
help
ping
publish
sow
sow_and_subscribe
sow_delete
subscribe
Example:
%> ./spark help sow
Returns the help and usage information for the 'sow' command.
```
Getting Help with Spark
`spark` requires that a supported command is passed as an argument. Each supported command has additional unique requirements and options available to change the behavior of `spark` and how it interacts with the AMPS engine.

For example, to learn more about running a publish command in `spark`, the following displays the help screen for the `spark` client's publish feature.
```
%> ./spark help publish
===============================
- Spark - AMPS client utility -
===============================
Usage:
spark publish [options]
Required Parameters:
server -- AMPS server to connect to
topic -- topic to publish to
Options:
authenticator -- Custom AMPS authenticator factory to use
delimiter -- decimal value of message separator character
             (default 10)
delta -- use delta publish
file -- file to publish records from, standard in when omitted
proto -- protocol to use (amps, fix, nvfix, xml)
         (type, prot are synonyms for backward compatibility)
         (default: amps)
rate -- decimal value used to send messages
        at a fixed rate. '.25' implies 1 message every
        4 seconds. '1000' implies 1000 messages per second.
Example:
% ./spark publish -server localhost:9003 -topic Trades -file data.fix
Connects to the AMPS instance listening on port 9003 and publishes records
found in the 'data.fix' file to topic 'Trades'.
```
Spark Commands
The sections below describe the commands supported by `spark`, with examples of how to use them and descriptions of the most commonly used options. For the full range of options provided by `spark`, including options provided for compatibility with previous `spark` releases, use the `spark help` command as described above.
Publish
The `publish` command is used to publish data to a topic on an AMPS server.
Common Options - spark publish
Option | Definition |
---|---|
server | AMPS server to connect to. |
topic | Topic to publish to. |
delimiter | Decimal value of message separator character (default 10). |
delta | Use delta publish (sends a delta_publish command to AMPS). |
file | File to publish messages from, stdin when omitted. The file provided to this argument can be either uncompressed or compressed in ZIP format. If a ZIP file is provided, it must contain one message per file. |
proto | Protocol to use. In this release, `amps`, `fix`, `nvfix`, and `xml` are supported (`type` and `prot` are synonyms for backward compatibility). Defaults to `amps`. |
rate | Messages to publish per second. This is a decimal value, so values less than 1 can be provided to create a delay of more than a second between messages. `.25` implies 1 message every 4 seconds; `1000` implies 1000 messages per second. |
secure | Specifies whether to use an SSL-secured connection. Values `true`, `yes`, or `1` select the `tcps` URI scheme; any other value uses `tcp`. Default: `tcp`. |
type | For protocols and transports that accept multiple message types on a given transport, specifies the message type to use. |
uriopts | Custom connection URI parameters to be passed in the URI query string. |
urischeme | Allows a custom URI scheme to be specified. When both `urischeme` and `secure` are provided, `urischeme` overrides `secure`. |
Examples
The examples below demonstrate three ways to publish records to AMPS using the `spark` client: a single message, a Python script, and a file.
Publish a Single Message
```
%> echo '{ "id" : 1, "data": "hello, world!" }' | \
./spark publish -server localhost:9007 -type json -topic order
total messages published: 1 (50.00/s)
```
In the example above, a single record is published to AMPS using the `echo` command. If you are comfortable with creating records by hand, this is a simple and effective way to test publishing in AMPS.

The JSON message is published to the topic `order` on the AMPS instance. Assuming the `order` topic is configured as a SOW topic, this publish can be followed with a `sow` command in `spark` to test whether the record was indeed published to the `order` topic.
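A minimal sketch of that follow-up query, assuming the same instance and topic as above:

```shell
# Query the 'order' SOW topic for the record published above.
# Assumes AMPS is running on localhost:9007 and 'order' is a SOW topic.
./spark sow -server localhost:9007 -type json -topic order -filter "/id = '1'"
```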
Publish using Python
```
%> python -c "for n in range(100): print('{\"id\":%d}' % n)" | \
./spark publish -topic disorder -type json -rate 50 \
-server localhost:9007
total messages published: 100 (50.00/s)
```
In the example above, the `-c` flag is used to pass a simple loop and print command to the Python interpreter, which prints the results to stdout.

The Python script generates 100 JSON messages of the form `{"id":0}`, `{"id":1}`, ... `{"id":99}`. The output of this command is piped to `spark` using the `|` character, which publishes the messages to the `disorder` topic in the AMPS instance.
Publish from a File
```
%> ./spark publish -server localhost:9007 -type json -topic chaos \
-file data.json
total messages published: 50 (12000.00/s)
```
Generating a file of test data is a common way to test AMPS functionality. The example above demonstrates how to publish a file of data to the topic `chaos` in an AMPS server. As previously mentioned, `spark` interprets each line of a text file as a distinct message.
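A minimal sketch of generating such a file and publishing it, assuming the same instance and topic as above:

```shell
# Generate a file with one JSON message per line -- spark treats
# each line of the file as a distinct message.
python3 -c "for n in range(50): print('{\"id\":%d}' % n)" > data.json

# Publish the file to the 'chaos' topic
# (assumes AMPS is running on localhost:9007).
./spark publish -server localhost:9007 -type json -topic chaos -file data.json
```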
SOW
The `sow` command allows a `spark` client to query the latest messages which have been persisted to a topic. The SOW in AMPS acts as a last-update cache, much like a database, and the `sow` command in `spark` is one of the ways to query it. The `sow` command supports regular expression topic matching and content filtering, which allow a query to be very specific when looking for data.

For the `sow` command to succeed, the topic queried must provide a SOW. This includes SOW topics, views, queues, and conflated topics. These features of AMPS are discussed in more detail in this guide.
Common Options - spark sow
Option | Definition |
---|---|
server | AMPS server to connect to. |
topic | Topic to query. |
batchsize | Batch size to use during query. A batch size greater than 1 can help improve performance, as described in the Querying the State of the World chapter of this guide. |
copy | Publishes records to the secondary server specified. |
filter | The content filter to use. |
format | Optional format used for displaying messages; may contain literal separator characters mixed with format tags. Notice that not all headers may be available on every request, depending on the options provided to the request. See the AMPS Command Reference for details. |
orderby | An expression that AMPS will use to order the results. |
proto | Protocol to use. In this release, `amps`, `fix`, `nvfix`, and `xml` are supported (`type` and `prot` are synonyms for backward compatibility). Defaults to `amps`. |
secure | Specifies whether to use an SSL-secured connection. Values `true`, `yes`, or `1` select the `tcps` URI scheme; any other value uses `tcp`. Default: `tcp`. |
topn | Request AMPS to limit the query response to the first N records returned. |
type | For protocols and transports that accept multiple message types on a given transport, specifies the message type to use. |
uriopts | Custom connection URI parameters to be passed in the URI query string. |
urischeme | Allows a custom URI scheme to be specified. When both `urischeme` and `secure` are provided, `urischeme` overrides `secure`. |
Examples
```
%> ./spark sow -server localhost:9007 -type json -topic order \
-filter "/id = '1'"
{ "id" : 1, "data" : "hello, world" }
Total messages received: 1 (Infinity/s)
```
This `sow` command queries the `order` topic, filtering for results which match the expression `/id = '1'`. The query returns the matching results in the topic, such as the record published in the previous publish command.

If the topic does not provide a SOW, the command returns an error indicating that the command is not valid for that topic.
Subscribe
The `subscribe` command allows a `spark` client to register interest in incoming messages on a topic, so that they are delivered in real time. Like the `sow` command, the `subscribe` command supports regular expression topic matching and content filtering, which allow a subscription to be very specific when looking for data as it is published to AMPS. Unlike the `sow` command, a subscription can be placed on a topic which does not have a persistent SOW cache configured, which makes the `subscribe` command very flexible in the messages it can be configured to receive.
Common Options - spark subscribe
Option | Definition |
---|---|
server | AMPS server to connect to. |
topic | Topic to subscribe to. |
ack | Enable acknowledgments when receiving from a queue. |
backlog | Request a max_backlog of greater than 1 when receiving from a queue. (See the chapter on Message Queues in this guide for more information.) |
copy | Publishes records to the secondary server specified. |
delta | Use delta subscription (sends a delta_subscribe command to AMPS). |
filter | Content filter to use. |
format | Optional format used for displaying messages; may contain literal separator characters mixed with format tags. Notice that not all headers may be available on every request, depending on the options provided to the request. See the AMPS Command Reference for details. |
proto | Protocol to use. In this release, `amps`, `fix`, `nvfix`, and `xml` are supported (`type` and `prot` are synonyms for backward compatibility). Defaults to `amps`. |
secure | Specifies whether to use an SSL-secured connection. Values `true`, `yes`, or `1` select the `tcps` URI scheme; any other value uses `tcp`. Default: `tcp`. |
type | For protocols and transports that accept multiple message types on a given transport, specifies the message type to use. |
uriopts | Custom connection URI parameters to be passed in the URI query string. |
urischeme | Allows a custom URI scheme to be specified. When both `urischeme` and `secure` are provided, `urischeme` overrides `secure`. |
Examples
```
%> ./spark subscribe -server localhost:9007 -topic chaos \
-type json -filter "/name = 'cup'"
{ "name" : "cup", "place" : "cupboard" }
```
The example above places a subscription on the `chaos` topic with a filter that only returns messages where `/name = 'cup'`. If this subscription is placed before a matching message is published, results similar to the above are received.
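Since `subscribe` supports regular expression topic matching, one subscription can cover several topics at once. A minimal sketch, assuming AMPS on localhost:9007 and hypothetical topic names beginning with `order`:

```shell
# Subscribe to every topic whose name matches the regular expression 'order.*'
# (for example 'order' and 'order_audit'); messages from all matching topics
# are delivered on this one subscription.
./spark subscribe -server localhost:9007 -type json -topic "order.*"
```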
SOW and Subscribe
The `sow_and_subscribe` command is a combination of the `sow` command and the `subscribe` command. When a `sow_and_subscribe` is requested, AMPS first returns all messages stored in the SOW which match the query. Once this has completed, all messages which match the subscription are sent to the client as they update the topic.

The `sow_and_subscribe` command is a powerful tool when it is necessary to examine both the contents of the SOW and the live subscription stream.
Common Options - spark sow_and_subscribe
Option | Definition |
---|---|
server | AMPS server to connect to. |
topic | Topic to query and subscribe to. |
batchsize | Batch size to use during query. |
copy | Publishes records to the secondary server specified. |
delta | Request delta for subscriptions (sends a sow_and_delta_subscribe command to AMPS). |
filter | Content filter to use. |
format | Optional format used for displaying messages; may contain literal separator characters mixed with format tags. Notice that not all headers may be available on every request, depending on the options provided to the request. See the AMPS Command Reference for details. |
orderby | An expression that AMPS will use to order the SOW query results. |
proto | Protocol to use. In this release, `amps`, `fix`, `nvfix`, and `xml` are supported (`type` and `prot` are synonyms for backward compatibility). Defaults to `amps`. |
secure | Specifies whether to use an SSL-secured connection. Values `true`, `yes`, or `1` select the `tcps` URI scheme; any other value uses `tcp`. Default: `tcp`. |
topn | Request AMPS to limit the SOW query results to the first N records returned. |
type | For protocols and transports that accept multiple message types on a given transport, specifies the message type to use. |
uriopts | Custom connection URI parameters to be passed in the URI query string. |
urischeme | Allows a custom URI scheme to be specified. When both `urischeme` and `secure` are provided, `urischeme` overrides `secure`. |
Examples
```
%> ./spark sow_and_subscribe -server localhost:9007 -type json \
-topic chaos -filter "/name = 'cup'"
{ "name" : "cup", "place" : "cupboard" }
```
This example uses the same topic and filter as the subscribe example above. The initial results are similar, since only the messages stored in the SOW are returned. If a publisher then began publishing data to the topic that matched the content filter, those messages would be printed to the screen in the same manner as a subscription.
SOW Delete
The `sow_delete` command is used to remove records from a SOW topic in AMPS. If a filter is specified, only messages which match the filter are removed. If a file is provided, the command reads messages from the file and sends them to AMPS, which deletes the matching messages from the SOW. If no filter or file is specified, the command reads messages from standard input (one per line) and sends them to AMPS for deletion.

It can be useful to test a filter by first using it in a `sow` command and making sure the records returned match what is expected. If that is successful, it is safe to use the filter for a `sow_delete`. Once records are deleted from the SOW, they are not recoverable.
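A minimal sketch of that verify-then-delete workflow, assuming the instance and topic from the examples above:

```shell
# Step 1: preview exactly which records the filter matches.
./spark sow -server localhost:9007 -type json -topic chaos \
    -filter "/name = 'cup'"

# Step 2: once the preview looks right, delete the matching records.
# Deleted records are not recoverable.
./spark sow_delete -server localhost:9007 -type json -topic chaos \
    -filter "/name = 'cup'"
```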
Common Options - spark sow_delete
Option | Definition |
---|---|
server | AMPS server to connect to. |
topic | Topic to delete records from. |
file | File from which to read messages to be deleted. |
filter | Content filter to use. |
proto | Protocol to use. In this release, `amps`, `fix`, `nvfix`, and `xml` are supported (`type` and `prot` are synonyms for backward compatibility). Defaults to `amps`. |
secure | Specifies whether to use an SSL-secured connection. Values `true`, `yes`, or `1` select the `tcps` URI scheme; any other value uses `tcp`. Default: `tcp`. |
type | For protocols and transports that accept multiple message types on a given transport, specifies the message type to use. |
uriopts | Custom connection URI parameters to be passed in the URI query string. |
urischeme | Allows a custom URI scheme to be specified. When both `urischeme` and `secure` are provided, `urischeme` overrides `secure`. |
Examples
```
%> ./spark sow_delete -server localhost:9007 \
-topic chaos -type json -filter "/name = 'cup'"
Deleted 1 records in 10ms.
```
With the `sow_delete` command above, we ask AMPS to delete records in the topic `chaos` which match the filter `/name = 'cup'`. In this example, we delete the record queried previously in the `sow_and_subscribe` example. `spark` reports that one matching message was removed from the SOW topic.
Ping
The `spark ping` command connects to an AMPS instance and attempts to log on. This is useful for determining whether an AMPS instance is running and responsive.
Common Options - spark ping
Option | Definition |
---|---|
server | AMPS server to connect to. |
proto | Protocol to use. In this release, `amps`, `fix`, `nvfix`, and `xml` are supported (`type` and `prot` are synonyms for backward compatibility). Defaults to `amps`. |
secure | Specifies whether to use an SSL-secured connection. Values `true`, `yes`, or `1` select the `tcps` URI scheme; any other value uses `tcp`. Default: `tcp`. |
uriopts | Custom connection URI parameters to be passed in the URI query string. |
urischeme | Allows a custom URI scheme to be specified. When both `urischeme` and `secure` are provided, `urischeme` overrides `secure`. |
Examples
```
%> ./spark ping -server localhost:9007 -type json
Successfully connected to tcp://user@localhost:9007/amps/json
```
In the example above, `spark` was able to successfully log on to the AMPS instance listening on port 9007.
```
%> ./spark ping -server localhost:9119
Unable to connect to AMPS
(com.crankuptheamps.client.exception.ConnectionRefusedException: Unable to
connect to AMPS at localhost:9119).
```
In the example above, `spark` was not able to log on to an AMPS instance at port 9119. The error shows the exception thrown by `spark`, which in this case was a `ConnectionRefusedException` from Java.
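Because `spark ping` prints a clear success or failure message, it can serve as a simple reachability check in a shell script. A minimal sketch that matches on the output text shown above:

```shell
# Check whether an AMPS instance is reachable by matching spark's output.
# Assumes AMPS on localhost:9007; adjust the server address as needed.
if ./spark ping -server localhost:9007 -type json | grep -q "Successfully connected"; then
    echo "AMPS is reachable"
else
    echo "AMPS is not reachable"
fi
```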
Spark Authentication
`spark` includes a way to provide credentials to AMPS for use with instances that are configured to require authentication. For example, to use a specific user ID and password to authenticate to AMPS, provide them in the URI in the format `user:password@host:port`.

The command below shows how to use `spark` to subscribe to a server, providing the specified username and password to AMPS.
```
$AMPS_HOME/bin/spark subscribe -type json \
-server username:password@localhost:9007
```
AMPS also provides the ability to implement custom authentication, and many production deployments use customized authentication methods. To support this, the `spark` authentication scheme is customizable. By default, the authentication scheme used by `spark` simply provides the username and password from the `-server` parameter, as described above.

Authentication schemes for `spark` are implemented in Java, as classes that implement `Authenticator` (the same method used by the AMPS Java client). To use a different authentication scheme with `spark`, implement the `AuthenticatorFactory` interface to return your custom authenticator, adjust the CLASSPATH to include the `.jar` file that contains the authenticator, and then provide the name of your `AuthenticatorFactory` on the command line. See the AMPS Java Client API documentation for details on implementing a custom `Authenticator`.
The command below explicitly loads the default factory, found in the `spark` package, without adjusting the CLASSPATH.
```
$AMPS_HOME/bin/spark subscribe -server username:password@localhost:9007 \
-type json -topic foo \
-authenticator com.crankuptheamps.spark.DefaultAuthenticatorFactory
```
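Loading a custom factory follows the same pattern. A minimal sketch, where `myauth.jar` and `com.example.MyAuthenticatorFactory` are hypothetical names standing in for your own jar and factory class, and the classpath is adjusted through `AMPS_SPARK_OPTS` as in the TCPS examples later in this chapter:

```shell
# myauth.jar and com.example.MyAuthenticatorFactory are placeholder names;
# substitute the jar and AuthenticatorFactory implementation you built.
export AMPS_SPARK_OPTS="-cp myauth.jar:spark.jar"
$AMPS_HOME/bin/spark subscribe -server username:password@localhost:9007 \
    -type json -topic foo \
    -authenticator com.example.MyAuthenticatorFactory
```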
Spark Support for TCPS
`spark` supports secure connections over SSL for AMPS TCPS transports. Details about the available `spark` command options are provided in the tables above.
`spark` recognizes the `AMPS_SPARK_OPTS` environment variable for passing Java properties to the underlying JVM. This is needed for SSL properties such as `-Djavax.net.ssl.trustStore` and `-Djavax.net.ssl.trustStorePassword`.
Typically, it is sufficient to set the trust store JVM properties in the `AMPS_SPARK_OPTS` environment variable and specify the `secure` option with a valid value. If `secure` is set to `true`, `yes`, or `1`, the connection uses the `tcps` URI scheme; otherwise, it uses `tcp`.
```
%> export AMPS_SPARK_OPTS="-Djavax.net.ssl.trustStore=./cacerts -Djavax.net.ssl.trustStorePassword=changeit"
%> ./spark subscribe -secure 1 -server localhost:10110 -type json -topic Orders
```
The example above places a subscription on the `Orders` topic over a `tcps` connection.
If additional customization of the AMPS connection URI is required, the `urischeme` and `uriopts` options can be specified. For example, a custom AMPS Client Transport can be created and associated with a specific URI scheme by defining a custom Transport class and mapping it to the chosen URI scheme name.

In the example below, a custom AMPS Client Transport is associated with the URI scheme `protected`:
```
%> export AMPS_SPARK_OPTS="-Djavax.net.ssl.trustStore=./cacerts -Djavax.net.ssl.trustStorePassword=changeit -cp secure.jar:spark.jar"
%> ./spark subscribe -urischeme protected -server localhost:10110 -type json -topic Orders
```
This generates the following URI:

```
protected://localhost:10110/amps/json
```
In this case, the `urischeme` option refers to a custom AMPS Client Transport. The custom class must be added to the Java classpath using `AMPS_SPARK_OPTS`. If both `urischeme` and `secure` were provided in this example, `urischeme` would override `secure`.
For more information on the properties used to customize SSL connections, see the SSL section in the Advanced Topics chapter of the AMPS Java Developer Guide.

For information on URI parameters that can be used to customize `spark` connections using the `uriopts` option, see the Connection Parameters for AMPS chapter of the AMPS Java Developer Guide.