Atomic Query and Subscribe
When a topic is recorded in the SOW, an application can request the current state of the topic and simultaneously subscribe to updates to the topic. In this case, AMPS first delivers all of the messages that match the query and then delivers updates to any record that matches the query. AMPS guarantees that no updates are missed or duplicated between the query and the subscription. As with a simple query, AMPS tests each message currently in the SOW against the specified content filter and returns all matching messages to the client. When the query begins, AMPS enters a subscription with the provided filter. After the query completes, AMPS delivers messages from that subscription. If a record is updated while the query is running, AMPS saves the update and delivers it immediately after the query completes.
As with a simple SOW query, the topic can be a literal topic name or a regular expression pattern. For more information on issuing queries, please see Querying the State of the World in the AMPS User Guide.
Spark: Basic SOW Query and Subscribe Example
Here's how to use spark to query the current state of an AMPS SOW topic and subscribe to updates.
This example assumes that:
- You have configured a topic named test-sow in the AMPS server with a message type of JSON.
- The test-sow topic uses the /id field of the message to calculate the key for the topic.
To retrieve the current state of the topic and subscribe, an application issues the sow_and_subscribe command. Since the command includes a subscription, the command stays active until it is explicitly stopped (or the application disconnects).
First, publish a message or two to the test-sow topic:
Open a new terminal in your Linux environment.
Use the following command (with AMPS_DIR set to the directory where you installed AMPS) to send a single message to AMPS:
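The exact command depends on your configuration; the sketch below assumes an AMPS instance accepting JSON messages on localhost port 9007, so substitute the address and port from the Transport section of your configuration file:

```
# Sketch: publish one JSON message to the test-sow topic.
# The address localhost:9007 is an assumption; use the port from your AMPS configuration.
echo '{ "id" : 1, "message" : "Hello, world!" }' | \
    $AMPS_DIR/bin/spark publish -server localhost:9007 -type json -topic test-sow
```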
spark automatically connects to AMPS and sends a logon command with the default credentials (the current username and an empty password). With the publish command, spark reads the message from standard input and publishes it to the JSON topic test-sow. The command prints a short summary line that includes the publish rate (the rate calculation will likely be different on your system). When the publisher sends the message, AMPS parses the message to determine the value of the Key fields in the message, and then either inserts the message for that key or overwrites the existing message with that key.
You can publish any number of messages this way. Each distinct id value will create a distinct record in the topic.
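For instance, publishing a message with a different id value (again assuming the same server address as above) adds a second record rather than overwriting the first:

```
# Sketch: a new /id value creates a new record in the SOW.
echo '{ "id" : 2, "message" : "Hello again" }' | \
    $AMPS_DIR/bin/spark publish -server localhost:9007 -type json -topic test-sow
```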
Next, retrieve the current contents of the topic:
Open a new terminal in your Linux environment.
Use the following command (with AMPS_DIR set to the directory where you installed AMPS) to retrieve the contents of the topic:
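As a sketch, again assuming the server address used in the publish step:

```
# Sketch: query the current contents of test-sow and stay subscribed for updates.
# Adjust -server to match your AMPS configuration.
$AMPS_DIR/bin/spark sow_and_subscribe -server localhost:9007 -type json -topic test-sow
```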
spark automatically connects to AMPS and sends a logon command with the default credentials (the current username and an empty password). spark then sends the sow_and_subscribe command to AMPS. This command requests the current contents of the test-sow topic and creates a subscription to the topic. spark shows the current contents of the topic. Notice that the output is strictly the message data, separated by newline characters; spark does not show any of the metadata for a message. spark remains running after the query completes, waiting for new publishes to arrive.
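For example, if you published the two messages sketched above, the query results would look roughly like the following (the exact lines depend on the messages you actually published):

```
{ "id" : 1, "message" : "Hello, world!" }
{ "id" : 2, "message" : "Hello again" }
```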
Publish more messages (or updates to the existing messages) to the topic. In the terminal you opened to publish the first messages:
Use the following command (with AMPS_DIR set to the directory where you installed AMPS) to send a message to AMPS:
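For example, reusing the server address assumed above, the following publish keeps the same id, so it replaces the existing record rather than creating a new one:

```
# Sketch: publishing with an existing /id value updates that record in the SOW.
echo '{ "id" : 1, "message" : "Hello, again, world!" }' | \
    $AMPS_DIR/bin/spark publish -server localhost:9007 -type json -topic test-sow
```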
Notice that the subscription receives the message. If you close the subscriber and re-run it, you will see that the second time the subscriber runs, it receives the updated messages in the query and, again, waits for changes to arrive.