AMPS Server Documentation 5.3.4
Copyright 2013-2024 60East Technologies, Inc.


Command-Line Basic Client

AMPS contains a command-line client, spark, which can be used to run queries, place subscriptions, and publish data. While it can be used for each of these purposes, spark is provided as a useful tool for informal testing and troubleshooting of AMPS instances. For example, you can use spark to test whether an AMPS instance is reachable from a particular system, or use spark to perform ad hoc queries to inspect the data in AMPS.

This chapter describes the commands available in the spark utility. For more information on the features available in AMPS, see the relevant chapters in this guide.

The spark utility is included in the bin directory of the AMPS install location. The spark client is written in Java, so running spark requires a Java Virtual Machine for Java 1.7 or later.

To run this client, simply type ./bin/spark at the command line from the AMPS installation directory. spark will output the help screen shown below, with a brief description of the spark client features.

%> ./bin/spark
===============================
- Spark - AMPS client utility -
===============================
Usage:

    spark help [command]

Supported Commands:

    help
    ping
    publish
    sow
    sow_and_subscribe
    sow_delete
    subscribe

Example:

    %> ./spark help sow

Returns the help and usage information for the 'sow' command.

Getting Help with Spark

spark requires that a supported command be passed as an argument. Each supported command has its own additional requirements and options that change the behavior of spark and how it interacts with the AMPS engine.

For example, if more information is needed to run a publish command in spark, the following command displays the help screen for the spark client's publish feature.

%>./spark help publish
===============================
- Spark - AMPS client utility -
===============================
Usage:

  spark publish [options]

Required Parameters:

  server    -- AMPS server to connect to
  topic     -- topic to publish to

Options:

  authenticator -- Custom AMPS authenticator factory to use
  delimiter     -- decimal value of message separator character
                   (default 10)
  delta         -- use delta publish
  file          -- file to publish records from, standard in when omitted
  proto         -- protocol to use (amps, fix, nvfix, xml)
                   (type, prot are synonyms for backward compatibility)
                   (default: amps)
  rate          -- decimal value used to send messages
                   at a fixed rate.  '.25' implies 1 message every
                   4 seconds. '1000' implies 1000 messages per second.

Example:

  % ./spark publish -server localhost:9003 -topic Trades -file data.fix

    Connects to the AMPS instance listening on port 9003 and publishes records
    found in the 'data.fix' file to topic 'Trades'.

Spark Commands

This section describes the commands supported by spark, along with examples of how to use them and descriptions of the most commonly used options. For the full range of options provided by spark, including options provided for compatibility with previous spark releases, use the spark help command as described above.

Publish

The publish command is used to publish data to a topic on an AMPS server.

Common Options - spark publish

Option

Definition

server (required)

AMPS server to connect to.

topic (required)

Topic to publish to.

delimiter

Decimal value of message separator character (default 10).

delta

Use delta publish (sends a delta_publish command to AMPS).

file

File to publish messages from, stdin when omitted. spark interprets each line in the input as a message.

The file provided to this argument can be either uncompressed or compressed in ZIP format.

proto

Protocol to use.

In this release, spark supports amps, fix, nvfix and xml.

Defaults to amps. spark also supports json as a synonym for amps in this release.

rate

Messages to publish per second.

This is a decimal value, so values less than 1 can be provided to create a delay of more than a second between messages. '.25' implies 1 message every 4 seconds. '1000' implies 1000 messages per second.

type

For protocols and transports that accept multiple message types on a given transport, specifies the message type to use.
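As a quick illustration of the rate arithmetic described above, the interval between messages is 1/rate seconds. The following is plain shell arithmetic, not a spark command:

```shell
# Illustration only: the interval between messages is 1/rate seconds.
awk 'BEGIN {
    for (i = 1; i < ARGC; i++)
        printf "rate %s -> one message every %g seconds\n", ARGV[i], 1 / ARGV[i]
}' .25 1000
```

This prints one message every 4 seconds for a rate of .25, and one message every 0.001 seconds for a rate of 1000.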

Examples

The examples below demonstrate how to publish records to AMPS using the spark client in three ways: as a single message, from a Python script, and from a file.

Publish a Single Message

%> echo '{ "id" : 1, "data": "hello, world!" }' |  \
   ./spark publish -server localhost:9007 -type json -topic order

   total messages published: 1 (50.00/s)

In the example above, a single record is published to AMPS using the echo command. If you are comfortable creating records by hand, this is a simple and effective way to test publishing to AMPS.

The JSON message is published to the topic order on the AMPS instance. This publish can be followed with a sow command in spark to test if the record was indeed published to the order topic.

Publish using Python

%> python3 -c "for n in range(100): print('{\"id\":%d}' % n)" | \
   ./spark publish -topic disorder -type json -rate 50 \
   -server localhost:9007

   total messages published: 100 (50.00/s)

In the example above, the -c flag passes a simple loop with a print call to the Python interpreter, which writes the results to stdout.

The Python one-liner generates 100 JSON messages of the form {"id":0}, {"id":1} ... {"id":99}. The output of this command is piped to spark using the | character, which publishes the messages to the disorder topic inside the AMPS instance.

Publish from a File

%> ./spark publish -server localhost:9007 -type json -topic chaos \
   -file data.json

   total messages published: 50 (12000.00/s)

Generating a file of test data is a common way to test AMPS functionality. The example above demonstrates how to publish a file of data to the topic chaos in an AMPS server. As previously mentioned, spark interprets each line of the file as a distinct message.
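A simple way to produce such a file is a shell loop that writes one JSON document per line. This is a sketch only; the file name data.json and the field names are arbitrary:

```shell
# Write 50 messages, one JSON document per line, as spark expects.
for n in $(seq 0 49); do
    printf '{"id": %d, "data": "message %d"}\n' "$n" "$n"
done > data.json

wc -l < data.json   # 50: one message per line
```

The resulting file can then be passed to the -file option of spark publish, as in the example above.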

SOW

The sow command allows a spark client to query the latest messages that have been persisted to a topic. The SOW in AMPS acts as a last-value cache, and the sow command in spark is one of the ways to query that cache. The sow command supports regular expression topic matching and content filtering, which allow a query to be very specific when looking for data.

For the sow command to succeed, the topic queried must provide a SOW. This includes SOW topics and views, queues, and conflated topics. These features of AMPS are discussed in more detail in this guide.

Common Options - spark sow

Option

Definition

server (required)

AMPS server to connect to.

topic (required)

Topic to query.

batchsize

Batch size to use during the query. A batch size greater than 1 can help improve performance, as described in the Querying the State of the World (SOW) chapter of this guide.

copy

Publishes records to the secondary server specified.

filter

The content filter to use.

proto

Protocol to use.

In this release, spark supports amps, fix, nvfix and xml.

Defaults to amps. spark also supports json as a synonym for amps in this release.

orderby

An expression that AMPS will use to order the results.

topn

Request AMPS to limit the query response to the first N records returned.

type

For protocols and transports that accept multiple message types on a given transport, specifies the message type to use.

format

Optional format used for displaying messages. May contain literal separator characters mixed with format tags. Not all headers may be available on every message, depending on the options provided to the request; see the AMPS Command Reference for details.

Allowed tags are:

{bookmark}, {command}, {correlation_id}, {data}, {expiration}, {lease_period}, {length}, {sowkey}, {user_id}, {timestamp}, {topic}

Example: -format "{command}:{data}"

Examples

%> ./spark sow -server localhost:9007 -type json -topic order \
   -filter "/id = '1'"

{ "id" : 1, "data" : "hello, world" }
Total messages received: 1 (Infinity/s)

This sow command queries the order topic and filters for results that match the expression /id = '1'. The query returns the matching records in the topic; in this case, the record published in the earlier publish example.

If the topic does not provide a SOW, the command returns an error indicating that the command is not valid for that topic.

Subscribe

The subscribe command allows a spark client to receive all messages published to a topic in real time. Like the sow command, the subscribe command supports regular expression topic matching and content filtering, which allow a subscription to be very specific when looking for data as it is published to AMPS. Unlike the sow command, a subscription can be placed on a topic that does not have a persistent SOW cache configured, which makes the subscribe command very flexible in the messages it can be configured to receive.

Common Options - spark subscribe

Option

Definition

server (required)

AMPS server to connect to.

topic (required)

Topic to subscribe to.

copy

Publishes records to the secondary server specified.

delta

Use delta subscription (sends a delta_subscribe command to AMPS).

filter

Content filter to use.

proto

Protocol to use.

In this release, spark supports amps, fix, nvfix and xml.

Defaults to amps. spark also supports json as a synonym for amps in this release.

ack

Enable acknowledgments when receiving from a queue. When this option is provided, spark acknowledges messages from the queue, signaling to AMPS that the message has been fully processed. (See the Message Queues chapter in this guide for more information.)

backlog

Request a max_backlog of greater than 1 when receiving from a queue. (See the Message Queues chapter in this guide for more information.)

type

For protocols and transports that accept multiple message types on a given transport, specifies the message type to use.

format

Optional format used for displaying messages. May contain literal separator characters mixed with format tags. Not all headers may be available on every message, depending on the options provided to the request; see the AMPS Command Reference for details.

Allowed tags are:

{bookmark}, {command}, {correlation_id}, {data}, {expiration}, {lease_period}, {length}, {sowkey}, {user_id}, {timestamp}, {topic}

Example: -format "{command}:{data}"

Examples

 %> ./spark subscribe -server localhost:9007 -topic chaos \
                        -type json -filter "/name = 'cup'"

{ "name" : "cup", "place" : "cupboard" }

The example above places a subscription on the chaos topic with a filter that only returns results for messages where /name = 'cup'. If this subscription is placed before executing the publish command from the publish-from-a-file example, then the results listed above are received.

sow_and_subscribe

The sow_and_subscribe command is a combination of the sow command and the subscribe command. When a sow_and_subscribe is requested, AMPS first returns all messages stored in the SOW that match the query. Once this has completed, any new messages that match the subscription are sent to the client.

The sow_and_subscribe command is a powerful tool to use when it is necessary to examine both the contents of the SOW and the live subscription stream.

Common Options - spark sow_and_subscribe

Option

Definition

server (required)

AMPS server to connect to.

topic (required)

Topic to query and subscribe to.

batchsize

Batch size to use during query.

copy

Publishes records to the secondary server specified.

delta

Request delta for subscriptions (sends a sow_and_delta_subscribe command to AMPS).

filter

Content filter to use.

proto

Protocol to use.

In this release, spark supports amps, fix, nvfix and xml.

Defaults to amps. spark also supports json as a synonym for amps in this release.

orderby

An expression that AMPS will use to order the SOW query results.

topn

Request AMPS to limit the SOW query results to the first N records returned.

type

For protocols and transports that accept multiple message types on a given transport, specifies the message type to use.

format

Optional format used for displaying messages. May contain literal separator characters mixed with format tags. Not all headers may be available on every message, depending on the options provided to the request; see the AMPS Command Reference for details.

Allowed tags are:

{bookmark}, {command}, {correlation_id}, {data}, {expiration}, {lease_period}, {length}, {sowkey}, {user_id}, {timestamp}, {topic}

Example: -format "{command}:{data}"

Examples

%> ./spark sow_and_subscribe -server localhost:9007 -type json \
                               -topic chaos -filter "/name = 'cup'"

{ "name" : "cup", "place" : "cupboard" }

In the example above, the same topic and filter are used as in the subscribe example earlier in this chapter. The initial results of this query are the same, since only the messages stored in the SOW are returned. If a publisher were started that published data matching the content filter to the topic, those messages would then be printed to the screen in the same manner as a subscription.

sow_delete

The sow_delete command is used to remove records from a SOW topic in AMPS. If a filter is specified, only messages that match the filter are removed. If a file is provided, the command reads messages from the file and sends them to AMPS, which deletes the matching messages from the SOW. If no filter or file is specified, the command reads messages from standard input (one per line) and sends those messages to AMPS for deletion.

It can be useful to test a filter by first using the desired filter in a sow command and making sure the records returned match what is expected. If that is successful, then it is safe to use the filter for a sow_delete. Once records are deleted from the SOW, they are not recoverable.

Common Options - sow_delete

Option

Definition

server (required)

AMPS server to connect to.

topic (required)

Topic to delete records from.

filter

Content filter to use.

Notice that a filter of 1=1 is true for every message and will delete the entire set of records in the SOW.

file

File from which to read messages to be deleted.

proto

Protocol to use.

In this release, spark supports amps, fix, nvfix and xml.

Defaults to amps. spark also supports json as a synonym for amps in this release.

type

For protocols and transports that accept multiple message types on a given transport, specifies the message type to use.

Examples

%> ./spark sow_delete -server localhost:9007 \
   -topic chaos -type json -filter "/name = 'cup'"

   Deleted 1 records in 10ms.

With the sow_delete command above, we ask AMPS to delete records in the topic chaos that match the filter /name = 'cup'. In this example, this deletes the record published and queried previously in the publish and sow_and_subscribe examples. spark reports that one matching message was removed from the SOW topic.

Ping

The spark ping command connects to an AMPS instance and attempts to log on. This is useful for determining whether an AMPS instance is running and responsive.

Common Options - spark ping

Option

Definition

server (required)

AMPS server to connect to.

proto

Protocol to use.

In this release, spark supports amps, fix, nvfix and xml.

Defaults to amps. spark also supports json as a synonym for amps in this release.

Examples

%> ./spark ping -server localhost:9007 -type json
Successfully connected to tcp://user@localhost:9007/amps/json

In the example above, spark successfully logged on to the AMPS instance listening on port 9007.

%> ./spark ping -server localhost:9119
Unable to connect to AMPS
(com.crankuptheamps.client.exception.ConnectionRefusedException: Unable to
connect to AMPS at localhost:9119).

In the example above, spark was unable to log on to an AMPS instance at port 9119. The error shows the exception thrown by spark, which in this case was a ConnectionRefusedException from the Java client.

Spark Authentication

spark includes a way to provide credentials to AMPS for use with instances that are configured to require authentication. For example, to use a specific user ID and password to authenticate to AMPS, simply provide them in the URI in the format user:password@host:port.

The command below shows how to use spark to subscribe to a server, providing the specified username and password to AMPS.

$AMPS_HOME/bin/spark subscribe -type json \
                               -server username:password@localhost:9007

AMPS also provides the ability to implement custom authentication, and many production deployments use customized authentication methods. To support this, the spark authentication scheme is customizable. By default, the authentication scheme used by spark simply provides the username and password from the -server parameter, as described above.

Authentication schemes for spark are implemented in Java, as classes that implement Authenticator, the same mechanism used by the AMPS Java client. To use a different authentication scheme with spark, implement the AuthenticatorFactory interface in spark to return your custom authenticator, adjust the CLASSPATH to include the .jar file that contains the authenticator, and then provide the name of your AuthenticatorFactory on the command line. See the AMPS Java Client API documentation for details on implementing a custom Authenticator.

The command below explicitly loads the default factory, found in the spark package, without adjusting the CLASSPATH.

$AMPS_HOME/bin/spark subscribe -server username:password@localhost:9007 \
                               -type json -topic foo \
      -authenticator com.crankuptheamps.spark.DefaultAuthenticatorFactory
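
As a sketch, using a custom authenticator might look like the following. The jar path and factory class name here are hypothetical placeholders; substitute the AuthenticatorFactory implementation packaged in your own .jar file.

```shell
# Hypothetical jar and class name: substitute your own AuthenticatorFactory implementation.
export CLASSPATH="$CLASSPATH:/path/to/my-auth.jar"

$AMPS_HOME/bin/spark subscribe -server username:password@localhost:9007 \
                               -type json -topic foo \
                               -authenticator com.example.MyAuthenticatorFactory
```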
