Big data is the future for better food safety
Nestlé is already collecting ‘big data’ from its businesses around the world and plans to analyse it more effectively to identify emerging threats and business opportunities.
John O’Brien, head of food safety and quality at Nestlé’s Research Centre, remarked on the “huge opportunities” presented by interrogating so-called ‘big data’ (the vast amounts of electronic data collected by companies). He was speaking at the Institute of Food Science & Technology’s spring conference, ‘Food safety in the court of public opinion’, held in London last Friday (April 23).
Nestlé has past experience of handling huge IT projects, having rolled out business best practices on the SAP enterprise resource planning system as its standard worldwide platform through ‘Project Globe’ several years ago, which O’Brien described as “the largest IT project in the world”.
‘Nestlé 100M analytical tests per year’
“We generate in Nestlé 100M analytical tests per year; this is just an example,” said O’Brien.
“The data today are not mined as fully as they could be and I think there is going to be an increasing opportunity to use big data analysis where you have your primary data, which could be analytical, and map that with metadata [information about that data] which could give you GIS-type information [geographical information system data], for example, or factory information. That’s when it gets interesting.
“Equally, the data that will be generated using technologies like whole genome sequencing and other analytical technology will have to be managed in some way.” O’Brien suggested the challenge would be deciding what information companies like Nestlé could extract from it.
He also argued that relevant stakeholders should be given access to such big data to enable the sharing of important information, specifically on food safety.
“We need to start to explore repositories of data which we have access to, both from a public point of view and from an industrial point of view,” said O’Brien.
“We’ve started that discussion. We’re keen on it; everyone will benefit if the data are managed in the right way, and there are huge opportunities to be gained. We can discover information that today cannot be seen.”
Big data fellow at Turing Institute
O’Brien’s comments were warmly welcomed by the Food Standards Agency’s (FSA’s) chief scientific adviser, Guy Poppy.
“This is music to my ears,” said Poppy. “I have been in the FSA seven months and I’ve had a personal mission to drive the big data agenda. And we’re on the verge of announcing a big data fellow based at the Turing Institute [the data science organisation named after World War II codebreaker Alan Turing] in London, using big data approaches to look at public health issues to do with food.”
Poppy added: “[Big data] is the future. It addresses the issues of corporate moral responsibility as well as corporate business. It enables people to trust what is going on, brings confidence and that type of thing. It really is very significant. With lots of data captured in potentially different ways, it is about elucidating the big signal from [background] noise.”
Poppy also referred to a “big piece of work” the FSA had done on Twitter feeds, which the Department of Health was planning to take forward. “[With] our work on Twitter feeds and norovirus, we have been able to pick up norovirus outbreaks four weeks before Public Health England’s epidemiological surveillance has.”
Poppy invited others interested in sharing big data to contact him.
The British Retail Consortium’s (BRC’s) technical director, David Brackston, responded that his organisation was prepared to share the non-commercially confidential data it held from food hygiene audits conducted globally against its BRC Global Standards scheme.