While the public claim concern for their privacy, they frequently appear to overlook it. This disparity between concern and behaviour is known as the Privacy Paradox. Such issues are particularly prevalent on wearable devices. These products can store personal data, such as text messages and contact details, yet owners rarely use protective features. Educational games can be effective in encouraging behavioural change. We therefore developed the first privacy game for (Android) Wear OS watches. Ten participants used smartwatches for two months, allowing their high-level settings to be monitored. Five individuals were randomly assigned to our treatment group and played a dynamically-customised privacy-themed game. To minimise confounding variables, the other five received the same app without the privacy topic. The treatment group improved their protection, with their use of screen locks increasing significantly (p = 0.043). In contrast, 80% of the control group continued to never restrict their settings. After the posttest phase, we explored behavioural rationale through semi-structured interviews. Privacy concerns became more nuanced in the treatment group, with opinions aligning with behaviour. Actions appeared to be influenced primarily by three factors: convenience, privacy salience and data sensitivity. To our knowledge, this is the first smartwatch game to encourage privacy-protective behaviour.
The public claim to be concerned about privacy, as suggested by a range of polls and surveys (Morar Consulting, 2016; Pike, Kelledy, & Gelnaw, 2017). However, people frequently exhibit behaviour which places their data at risk (Beresford, Kübler, & Preibusch, 2012; Felt et al., 2012). This disparity between claimed concern and empirical action is known as the Privacy Paradox (Norberg, Horne, & Horne, 2007). The situation often arises through a lack of awareness (Deuker, 2009). This poses a particular risk for wearables, which are both novel and unfamiliar (Williams, Nurse, & Creese, 2017). Smartwatches offer exciting functionality, providing interactive apps and online connectivity. They can also store a variety of personal data, from text messages to contact details (Do, Martini, & Choo, 2017). Despite this, users rarely adjust available settings to protect their privacy (Udoh & Alkharashi, 2016). This has led to the Privacy Paradox being prevalent in this environment (Williams et al., 2017).

Previous work has suggested that this issue can be mitigated by increasing awareness (Deuker, 2009). Therefore, many studies have sought to educate users on privacy matters (Kelley, Bresee, Cranor, & Reeder, 2009; Hélou, Guandouz, & Aïmeur, 2012). Unfortunately, highlighting a problem is often insufficient to change behaviour (Bada, Sasse, & Nurse, 2015). Since privacy is rarely a primary goal (Hughes-Roberts & Furnell, 2015), individuals might lack the motivation to protect their data. If we hope to incentivise protection, privacy should be aligned with user wants (Dolan, Hallsworth, Halpern, King, & Vlaev, 2010). Rather than mandating compliance, we can then highlight the empowering aspects of protection. Serious games embed incentives within interactivity, using positive reinforcement to instil knowledge (Kumar, 2013).