Snap recently announced an upgrade to its ‘Scan’ feature. Now, I don’t personally use Snap, but according to The Verge, the upgrade makes it a whole lot more useful.
Apart from the more obvious shopping use case, which many rivals offer as well (point the camera at an item of clothing and it helps you discover and buy similar items), it includes what are described as ‘Camera shortcuts’. This feature provides context-based recommendations – point the camera towards the sky, for example, and it selects relevant audio clips and visual filters and lets you apply them easily. The Verge article describes the feature as still in its infancy and a bit ‘hit and miss’, but with the improvements that are sure to follow, and integration into the Snap Spectacles, the potential could be quite interesting (and useful).
So what’s the relevance to healthcare?
Users in healthcare – whether service users, their families and carers, healthcare professionals or administrators – have to interact with various systems to provide information, check information or results, and ‘order’ actions or services. Often this means using systems that are not designed with the specific user in mind (though this is improving), switching between systems for different scenarios, or working through processes that are more difficult than they need to be.
One potential option is to build a single system that brings everything together, or at least provides access to all the ‘underlying systems’ in a streamlined, easy manner. But bringing all systems together can be quite tricky, and the resulting ‘single’ interface may still not serve the various user groups and their needs well.
Let them choose
Another potential option: what if the ‘end points’ at which users interact with the systems were numerous and not ‘fixed’ – taking into account the various user groups, their needs and preferences (and what they are already well equipped to do)?
For example, Snap users love communicating through images – so what if they could record their calorie intake simply by taking a photo, or update their mood diary by taking a photo of themselves (perhaps with some filters to fine-tune the ‘update’)? What if health professionals could update a patient’s record in an EHR, or look up a patient’s test results, through an app they love using (and have configured to suit their preferences to save time) instead of having to use the default EHR app?
Why don’t we separate the health applications and related data from the ‘user interaction’ elements so these can be ‘mixed and matched’?
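To picture what this separation might look like: multiple interaction layers (a photo-first consumer app, a clinician’s preferred EHR app) could all write to one shared, standardized record format. The sketch below is a hypothetical illustration only – the record structure is loosely inspired by FHIR-style resources, but the function, field names and app names are my own assumptions, not a real API.

```python
from datetime import datetime, timezone

def make_observation(patient_id, code, value, unit, source_app):
    """Build a minimal, FHIR-inspired observation record.
    (Hypothetical structure for illustration, not a real FHIR resource.)"""
    return {
        "resourceType": "Observation",
        "subject": {"reference": f"Patient/{patient_id}"},
        "code": {"text": code},
        "valueQuantity": {"value": value, "unit": unit},
        "meta": {
            "source": source_app,
            "recorded": datetime.now(timezone.utc).isoformat(),
        },
    }

# A photo-first consumer app: calories estimated from a meal photo.
snap_entry = make_observation("p123", "calorie-intake", 540, "kcal",
                              source_app="photo-diary")

# A clinician's preferred app writing to the same data layer.
ehr_entry = make_observation("p123", "calorie-intake", 540, "kcal",
                             source_app="clinic-ehr")

# Both interaction layers produce the same shape of record, so the
# underlying data layer doesn't care which 'front end' was used.
assert snap_entry["valueQuantity"] == ehr_entry["valueQuantity"]
```

The point of the sketch is that only the `meta.source` differs between the two entries – the data layer stays stable while the interaction layer is freely ‘mixed and matched’.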
Obviously, there are several considerations and challenges to think through – three key items below:
- Open ecosystems with separation of ‘data’ and ‘application’ layers
- Technical data standards to ensure data sharing is meaningful and accurate, and robust governance to ensure user data privacy is maintained (and that data is shared appropriately)
- The right incentives and commercial framework to ensure new entrants have a chance and that incumbents see value in being part of the ecosystem
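On the second point, even a lightweight sketch shows why shared technical standards matter: every participant in the ecosystem has to agree on required fields and allowed units before data exchange is meaningful. The ‘standard’ below is a toy example – the field names and unit rules are illustrative assumptions, not any real healthcare standard.

```python
# Hypothetical, minimal 'standard' for a shared observation record:
# the required fields and allowed units here are illustrative only.
REQUIRED_FIELDS = {"patient_id", "code", "value", "unit"}
ALLOWED_UNITS = {"calorie-intake": {"kcal", "kJ"}}

def validate_record(record):
    """Return a list of problems; an empty list means the record
    conforms to our toy standard."""
    problems = [f"missing field: {f}"
                for f in REQUIRED_FIELDS - record.keys()]
    code = record.get("code")
    if code in ALLOWED_UNITS and record.get("unit") not in ALLOWED_UNITS[code]:
        problems.append(f"unit {record.get('unit')!r} not allowed for {code!r}")
    return problems

good = {"patient_id": "p123", "code": "calorie-intake",
        "value": 540, "unit": "kcal"}
bad = {"patient_id": "p123", "code": "calorie-intake",
       "value": 540, "unit": "slices"}

assert validate_record(good) == []
assert validate_record(bad) == ["unit 'slices' not allowed for 'calorie-intake'"]
```

In a real ecosystem this role is played by published standards and conformance testing rather than ad-hoc checks, but the principle is the same: without an agreed schema, ‘mix and match’ front ends would produce data no other participant can trust.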
Very happy to hear your comments below or feel free to email me to share ideas – firstname.lastname@example.org