How does the client know how many connections exist at a given time?

  • A client can determine the number of connections across all of their organization's streams by using the GetStreams call, or for a single stream by using the GetStreams By ID call. Both calls return a count of 'members'. The member count is the number of connections to that stream at the time the call was made. If there are 0 members/connections, then you are not ingesting data from that stream at that time. A hedged sketch of checking the member count follows this list.

    • Remember when testing that opening a main stream URL in a browser tab will count as a connection and will engage load balancing. Placing main stream URLs in browser tabs is not recommended.

    • As noted above, “member” is the number of connections to that stream. It is not related to the number of users or patients in the org.
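
  • The sketch below is illustrative only: the base URL, auth header, and exact response shape are assumptions, so consult the Streaming API documentation for the actual GetStreams and GetStreams By ID request formats.

```python
import requests

# Placeholder values -- the real base URL and auth scheme come from the Streaming API docs.
STREAMS_URL = "https://example.validic.com/streams"
HEADERS = {"Authorization": "Bearer <your-token>"}

def count_connections(stream_id=None):
    """Return the 'members' count for one stream, or a dict of counts for all streams."""
    url = f"{STREAMS_URL}/{stream_id}" if stream_id else STREAMS_URL
    resp = requests.get(url, headers=HEADERS, timeout=10)
    resp.raise_for_status()
    body = resp.json()

    if stream_id:
        # GetStreams By ID: assume a single stream object with a 'members' field
        return body.get("members", 0)
    # GetStreams: assume a list of stream objects, each with 'id' and 'members' fields
    return {s.get("id"): s.get("members", 0) for s in body}

if __name__ == "__main__":
    for sid, members in count_connections().items():
        status = "ingesting" if members > 0 else "no active connections"
        print(f"stream {sid}: {members} member(s) -> {status}")
```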

What SSE (Server Sent Events) listener should we use to ingest from Streaming API?

  • Validic doesn’t maintain an SSE listener.

  • Any standard SSE listener will work with the Streaming API.

  • It is the client's responsibility to either build an SSE listener or use an open-source SSE listener written in a language the client's team is familiar with. A web search for SSE listeners will return many options; determining the right one for your code base is research the client should do. A minimal illustrative sketch follows this list.
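
  • For illustration only, here is a minimal SSE listener sketch using Python's requests package. The stream URL is a placeholder, and the sketch assumes each event's data field is JSON; any standards-compliant SSE client library can be used instead.

```python
import json
import requests

STREAM_URL = "https://example.validic.com/streams/<stream-token>"  # placeholder

def listen(url):
    """Connect to an SSE endpoint and yield each event's parsed data payload."""
    headers = {"Accept": "text/event-stream"}
    # connect timeout of 10s, no read timeout so the stream stays open
    with requests.get(url, stream=True, headers=headers, timeout=(10, None)) as resp:
        resp.raise_for_status()
        data_lines = []
        for line in resp.iter_lines(decode_unicode=True):
            if line == "":                      # a blank line terminates an event
                if data_lines:
                    yield json.loads("\n".join(data_lines))
                    data_lines = []
            elif line.startswith("data:"):
                data_lines.append(line[len("data:"):].strip())
            # other SSE fields (event:, id:, retry:) are ignored in this sketch

if __name__ == "__main__":
    for event in listen(STREAM_URL):
        print(event)
```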

How do I know what is a new data event and what is an update to an existing data event?

  • The relevant section of the Streaming API documentation should be reviewed.

  • The id in the data event will persist for each update that comes through for that record.

  • The checksum will be unique for each update to a given data event id.

  • The client should have logic to check whether an id was previously processed. If it was, then the incoming event is an update to that data event. The client should write the values associated with the most recent checksum for a given id to their database; the most recent update can be determined by the created_at in the data event payload. See the sketch after this list.

  • This is most commonly seen with Summary records as they are updated throughout the entire day, but can be seen with other data types.
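
  • A minimal sketch of that logic is below, assuming an in-memory dictionary stands in for the client's database and that created_at is an ISO 8601 timestamp.

```python
from datetime import datetime

# latest record seen per data event id: {id: {"checksum": ..., "created_at": ...}}
seen = {}

def classify(event):
    """Classify an incoming data event as 'new', 'update', or 'stale'."""
    event_id = event["id"]
    checksum = event["checksum"]
    # assumes created_at is ISO 8601, e.g. "2023-05-01T12:00:00Z"
    created_at = datetime.fromisoformat(event["created_at"].replace("Z", "+00:00"))

    previous = seen.get(event_id)
    if previous is None:
        seen[event_id] = {"checksum": checksum, "created_at": created_at}
        return "new"

    # Same id seen before: treat it as an update if it is newer than what is stored.
    if created_at > previous["created_at"] and checksum != previous["checksum"]:
        seen[event_id] = {"checksum": checksum, "created_at": created_at}
        return "update"

    return "stale"  # older or duplicate delivery; keep the record already stored
```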

What happens if we create 2 main streams for the same org?

  • If two main streams are created for the same org and the event_type_filter and resource_filter are the same for both, the streams will simply be duplicates of each other.

  • Some clients in the past have used two main streams in scenarios like the following:

    • I want to ingest summary data through one stream and measurement data through another.

    • I want to ingest data events through one stream and connection events through another.

  • Clients should review their use cases to determine whether two streams are actually needed. If you are ingesting all data and connection events through one stream, then duplicating that stream configuration should be avoided. A hedged configuration sketch follows.
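
  • The configuration sketch below is hypothetical: only the event_type_filter and resource_filter field names come from the stream configuration itself, and the filter values shown are assumptions. It simply illustrates the split-stream scenarios above.

```python
# Hypothetical stream configurations (filter values are assumptions; see the
# Streaming API docs for the real create-stream request format).

# Scenario: summary data through one stream, measurement data through another.
summary_stream = {
    "event_type_filter": ["data_events"],
    "resource_filter": ["summaries"],
}
measurement_stream = {
    "event_type_filter": ["data_events"],
    "resource_filter": ["measurements"],
}

# Two streams for the same org with identical event_type_filter and
# resource_filter would simply duplicate each other's traffic.
```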
