Atomic Query and Subscribe

When a topic is recorded in the SOW, an application can request the current state of the topic and simultaneously subscribe to updates to the topic. In this case, AMPS first delivers all of the messages that match the query and then delivers any updates to records that match the query. AMPS guarantees that no updates are missed or duplicated between the query and the subscription. As with a simple query, AMPS tests each message currently in the SOW against the content filter specified and returns all matching messages to the client. When the query begins, AMPS enters a subscription with the provided filter. After the query completes, AMPS delivers messages from the subscription. If a record is updated while the query is running, AMPS saves the update and delivers it immediately after the query completes.

As with a simple SOW query, the topic can be a literal topic name or a regular expression pattern. For more information on issuing queries, please see Querying the State of the World in the AMPS User Guide.
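
For example, to limit both the query and the subscription to messages whose /id is 1, a command along the lines of the following could be used. This is a sketch: it assumes the test-sow topic configured in the walkthrough below and spark's -filter option (see the spark documentation for the full list of options).

    $ $AMPS_DIR/bin/spark sow_and_subscribe -server localhost:9007 \
      -type json -topic test-sow -filter "/id = 1"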

Spark: Basic SOW Query and Subscribe Example

Here's how to use spark to query the current state of an AMPS SOW topic and subscribe to updates.

This example assumes that:

  • You have configured a SOW topic named test-sow, with a message type of json, in the AMPS server.

  • The test-sow topic uses the /id field of the message to calculate the key for the topic.

To retrieve the current state of the topic and subscribe, an application issues the sow_and_subscribe command. Since the command includes a subscription, the command stays active until it is explicitly stopped (or the application disconnects).

First, publish a message or two to the test-sow topic:

  1. Open a new terminal in your Linux environment.

  2. Use the following command (with AMPS_DIR set to the directory where you installed AMPS) to send a single message to AMPS:

    $ echo '{"id":1,"note":"Crank it up with a SOW!"}' | \
      $AMPS_DIR/bin/spark publish -server localhost:9007 \
      -type json -topic test-sow
  3. spark automatically connects to AMPS and sends a logon command with the default credentials (the current username and an empty password). With the publish command, spark reads the message from standard input and publishes the message to the JSON topic test-sow. The command produces output similar to the following line (the rate calculation will likely be different):

    total messages published: 1 (333.33/s)
  4. When the publisher sends the message, AMPS parses the message to determine the values of the key fields in the message, and then either inserts the message for that key or overwrites the existing message with that key.

  5. You can publish any number of messages this way. Each distinct id value will create a distinct record in the topic.
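
You can also publish several messages in a single spark invocation. The following command is a sketch: it assumes that spark publish treats each line read from standard input as a separate json message, so each line below becomes its own record in the topic.

    $ printf '%s\n' \
        '{"id":2,"note":"Second record"}' \
        '{"id":3,"note":"Third record"}' | \
      $AMPS_DIR/bin/spark publish -server localhost:9007 \
      -type json -topic test-sow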

Next, retrieve the current contents of the topic:

  1. Open a new terminal in your Linux environment.

  2. Use the following command (with AMPS_DIR set to the directory where you installed AMPS) to retrieve the contents of the topic:

    $ $AMPS_DIR/bin/spark sow_and_subscribe -server localhost:9007 \
      -type json -topic test-sow
  3. spark automatically connects to AMPS and sends a logon command with the default credentials (the current username and an empty password). spark then sends the sow_and_subscribe command to AMPS. This command requests the current contents of the test-sow topic and creates a subscription to the topic.

  4. spark shows the current contents of the topic. Notice that the output is strictly the message data, separated by newline characters. spark does not show any of the metadata for a message.

  5. spark remains running after the query completes, waiting for new publishes to arrive.
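
For example, if the topic contains only the single message published earlier, the query portion of the output should look similar to the following line; any messages later delivered on the subscription are printed the same way, as they arrive.

    {"id":1,"note":"Crank it up with a SOW!"}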

Publish more messages (or updates to the existing messages) to the topic. In the terminal you opened to publish the first messages:

  1. Use the following command (with AMPS_DIR set to the directory where you installed AMPS) to send a message to AMPS:

    $ echo '{"id":1,"note":"Crank it up with a SOW!"}' | \
      $AMPS_DIR/bin/spark publish -server localhost:9007 \
      -type json -topic test-sow
  2. Notice that the subscription receives the message.
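
To see a record actually change, publish a message that reuses an existing /id with different content. For example, the following command (the note text is purely illustrative) replaces the record whose /id is 1, and the subscriber prints the updated message as soon as AMPS delivers it.

    $ echo '{"id":1,"note":"Updated from the publisher"}' | \
      $AMPS_DIR/bin/spark publish -server localhost:9007 \
      -type json -topic test-sow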

If you close the subscriber and run it again, you will see that, this time, the query returns the updated messages and the subscription once again waits for changes to arrive.

Copyright 2013-2024 60East Technologies, Inc.