A big data analytics approach to quality, reliability and risk management
Nowadays, almost everything around us produces Big Data (BD). Digital processes and social media exchanges generate it, while communication and sensor systems transmit it. The spread of smartphones and mobile devices has further increased its volume. Multiple sources thus provide BD at high velocity, volume and variety, so organisations need to adopt adequate processing power, analytic capabilities and skills in order to extract meaningful value from it (Janssen et al., 2017). BD has rapidly become a mainstream activity in organisations, changing the way people within them work together. The culture in which business and IT leaders have to join forces to achieve value from all data is evolving. Tapping into large-scale, fast-moving and complex streams of data has the potential to transform the way organisations make their decisions. At the same time, the increasing demand for insights requires a new approach to defining tools and practices. In the current competitive business and industrial environment, top management has to be fully knowledgeable about new thinking, techniques and developments in the field. According to Zhang et al. (2017), emerging advanced technologies related to identification processes, wireless sensors, radio frequency identification, communication technologies and information network technologies have created the new era of the Internet of Things (IoT). Zhang et al. (2017) further asserted that the IoT offers an IT infrastructure to support the exchange of information about “things and processes” in a real-time and reliable manner. Data thus represent the connecting bridge between the cyber and the physical world. However, as suggested by Berti-Équille (2007), it is important to highlight that even a large amount of data is of little use if its quality is not taken into consideration in the analysis.
In fact, the impact of low data quality on the validity and interpretation of BD results leads to the conclusion that every designed approach has to ensure data quality and accuracy. Many studies propose techniques for evaluating data quality in the IoT context, as summarised by Karkouch et al. (2016). Moreover, the varied nature and relevance of the available data require particular attention in terms of security and privacy in most application domains, such as the personal, home, government, enterprise and industrial domains (Ouaddah et al., 2017). According to Wu et al. (2017), this enormous set of data and the continuous improvement of information technology highlight the difficulty of properly managing the available information. Nevertheless, there is a growing awareness of the relevance of this new approach to the management of activities. In the last few years, BD and the IoT have been rapidly gaining ground in many sectors. Zhao et al. (2017) adopted a BD approach to develop a multi-objective optimisation model for green supply chain management (SCM) and, in particular, to minimise the inherent risk caused by hazardous materials. The BD approach defined the guidelines for data acquisition within the entire supply chain (SC) considered and for data quality control. In this context, Lillrank (2003) elaborated standard definitions of quality and suggested guidelines for methodology development; in particular, he postulated the distinction between information-as-artefacts and information-as-deliverables. Furthermore, he highlighted the need to improve the integration of BD science in the SCM sector. Gunasekaran et al. (2017) underlined the relevance of BD, used jointly with predictive analysis, for achieving business value and firm performance. Their study answered several research questions regarding the relation between BD and top management commitment, supply chain connectivity and information sharing.