Handling Duplicate Timestamps in SDK Readings
Overview
Our SDK occasionally retrieves readings with identical timestamps. This document outlines the causes of this phenomenon and provides recommendations for handling such scenarios.
Causes of Duplicate Timestamps
Historical Data Retrieval: The SDK sets the time on the device during the pairing process. However, if the device was used before pairing and stores historical readings, those readings were taken before the clock was set, so multiple historical readings may be retrieved with the same timestamp.
Device Precision Limitations: Some devices (e.g., glucometers) report time only to the minute, not to the second or millisecond, so rapid consecutive readings can receive identical timestamps. This is a user education opportunity: guide users to leave time between readings so that two readings do not fall within the same minute on these devices.
Rapid Consecutive Measurements: Certain devices, particularly weight scales, may produce multiple readings in quick succession as users step on and off the scale. This is also a user education opportunity: reinforce best practices for taking a single, accurate weight reading.
If you are seeing this with blood pressure readings, remind the user to allow the cuff to deflate fully between measurements to ensure accurate readings. Waiting 5 minutes between readings is best practice to allow for full deflation of the cuff.
Recommended Handling Strategies
Strategy 1: Selective Storage
When encountering multiple readings with the same timestamp:
Store all readings in your database
Implement a mechanism to flag or identify which reading should be displayed
Develop a user interface that allows users to select the correct reading
Pros:
Preserves all data
Allows for user verification
Maintains data integrity
Cons:
Requires additional storage
Necessitates user interaction for resolution
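The steps above can be sketched as follows. This is a minimal illustration, not the SDK's API: the Reading class, its fields, and the function names are hypothetical, and the first duplicate is flagged only provisionally until the user confirms a selection.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Reading:
    timestamp: str        # as reported by the device, possibly minute-precise
    value: float
    preferred: bool = False  # flag identifying which duplicate to display

def store_readings(readings):
    """Store every reading, grouping duplicates by timestamp.

    The first reading in each group is provisionally flagged as
    preferred; the UI can later let the user change the selection.
    """
    by_timestamp = defaultdict(list)
    for r in readings:
        by_timestamp[r.timestamp].append(r)
    for group in by_timestamp.values():
        group[0].preferred = True  # provisional choice until the user confirms
    return by_timestamp

def select_preferred(group, index):
    """Apply the user's choice of which duplicate reading is correct."""
    for i, r in enumerate(group):
        r.preferred = (i == index)
```

In a real application the grouping would live in your database rather than in memory, but the shape is the same: keep every row, and track exactly one preferred row per timestamp.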
Strategy 2: Ignore Duplicates
When multiple readings with the same timestamp are detected:
Discard all readings with the duplicate timestamp
Log the occurrence for monitoring and analysis purposes
Pros:
Simpler implementation
Avoids potential data inconsistencies
Cons:
Potential loss of valid data
May require explaining missing data points to end users
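A sketch of Strategy 2, assuming readings arrive as dictionaries with a "timestamp" key (the field name and logger name are illustrative): every reading whose timestamp appears more than once is discarded, and each discard is logged for monitoring.

```python
import logging
from collections import Counter

logger = logging.getLogger("sdk.readings")

def drop_duplicate_timestamps(readings):
    """Discard every reading whose timestamp appears more than once,
    logging each occurrence for later frequency analysis."""
    counts = Counter(r["timestamp"] for r in readings)
    kept = []
    for r in readings:
        if counts[r["timestamp"]] > 1:
            logger.warning("Discarding reading with duplicate timestamp %s",
                           r["timestamp"])
        else:
            kept.append(r)
    return kept
```

Note that this drops all copies of a duplicated timestamp, not just the extras, since there is no way to know which copy is correct.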
Implementation Considerations
Database Design:
If implementing Strategy 1, ensure your database schema can accommodate flags or indicators for preferred readings.
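One way to accommodate a preferred-reading flag in a relational schema, sketched here with SQLite (table and column names are illustrative): duplicates on the timestamp column are allowed, while a partial unique index guarantees at most one preferred reading per device and timestamp.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE readings (
    id        INTEGER PRIMARY KEY,
    device_id TEXT NOT NULL,
    taken_at  TEXT NOT NULL,              -- timestamp as reported by the device
    value     REAL NOT NULL,
    preferred INTEGER NOT NULL DEFAULT 0  -- 1 = reading the UI should display
);
-- Duplicates on taken_at are allowed, but at most one row per
-- (device, timestamp) pair may be flagged as preferred.
CREATE UNIQUE INDEX one_preferred
    ON readings (device_id, taken_at)
    WHERE preferred = 1;
""")
```

Enforcing the invariant in the schema, rather than in application code, means a bug elsewhere cannot leave two "preferred" rows for the same timestamp.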
User Experience:
For Strategy 1, design a clear and intuitive interface for users to select the correct reading.
For Strategy 2, consider how to communicate to users that some data points may be omitted due to timestamp conflicts.
Monitoring and Logging:
Implement logging for occurrences of duplicate timestamps to track frequency and patterns.
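A structured log record makes frequency and per-device patterns easy to aggregate later. This is a sketch only; the field names and logger name are illustrative, not part of the SDK.

```python
import json
import logging

logger = logging.getLogger("sdk.duplicates")

def log_duplicate(device_id, timestamp, count):
    """Emit one structured record per duplicate-timestamp event.

    Returns the JSON string so callers (or tests) can inspect it.
    """
    record = json.dumps({
        "event": "duplicate_timestamp",
        "device_id": device_id,
        "timestamp": timestamp,
        "reading_count": count,  # how many readings shared this timestamp
    })
    logger.warning(record)
    return record
```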
Data Integrity:
Ensure that your chosen strategy is consistently applied across all relevant parts of your application.
Conclusion
The choice between these strategies depends on your specific use case, data integrity requirements, and user experience considerations. We recommend thoroughly testing your chosen approach and monitoring its effectiveness over time.
For further assistance or clarification, please contact our support team at support@validic.com.