Working with data to maintain high food standards
The role of the Food Standards Agency is to safeguard public health and protect the interests of consumers in relation to food, working closely with the UK Government as well as the governments of Wales and Northern Ireland. As our food system continues to evolve, the FSA’s strategy needs to anticipate change. New technologies, business models and changing consumer behaviours mean that the FSA needs to think differently about how it meets its objectives. The use of technology and the analysis of data play a critical role in ensuring that our food system can be trusted, as Julie Pierce explains.
How do you ensure UK food safety when so much produce is now imported?
When it comes to food safety it’s important to understand that there is no single global food standard, so right from the off it can be difficult to judge the quality of produce by comparing one territory with another. However, there are standards for food safety. We have an organisation called the Codex Alimentarius Commission (CAC), which was established by the World Health Organization (WHO) and the Food and Agriculture Organization of the United Nations (FAO) to protect consumer health and promote fair practices in food trade. Every country is represented at CAC in some form or other. And then there are other standards bodies for safety that most big businesses and exporters will have signed up to, plus a whole process of inspection across the various countries. As soon as a food item hits UK shores it has to follow UK regulations, and there are a lot of different conversations going on about the different aspects of that food, relating to safety, authenticity and traceability, as well as wider risks around ongoing availability. These conversations and insights are all made possible by the collection and interrogation of data.
How is technology helping to improve processes for assessing risk?
Data is used on a daily basis to monitor what is going on across the world, with about 40 regulatory organisations like ours talking to each other about the risks that they see relating to the food in their country. There is a great deal of data to analyse, and technology can be used to scrape and translate all of these different information sources to determine whether certain foods are likely to come to the UK or not, and, if they are coming to UK shores, to determine any risk factors. Lists of ingredients are included in this data, and again, by checking, cross-referencing and comparing data, an ingredient such as ‘egg’, for example, can more readily be clarified. Does it mean chicken egg, duck egg, fish egg even? I’d say that every 24 hours we are analysing brand new data that comes in, interpreting it and drawing key insights, before presenting that to the relevant parties.
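A minimal sketch of the kind of ingredient disambiguation described above, assuming a hand-built lookup table that cross-references an ambiguous term against the product category it appears in. The terms, categories and mappings here are invented for illustration and are not FSA data:

```python
# Illustrative table of ambiguous ingredient terms, keyed first by term,
# then by the product category in which the term appears.
AMBIGUOUS_TERMS = {
    "egg": {
        "bakery": "chicken egg",
        "sushi": "fish egg (roe)",
        "cured duck products": "duck egg",
    }
}

def clarify_ingredient(term: str, product_category: str) -> str:
    """Return a more specific reading of an ingredient term, if one is known."""
    readings = AMBIGUOUS_TERMS.get(term.lower())
    if readings is None:
        return term  # not ambiguous in our (illustrative) table
    return readings.get(product_category.lower(), f"{term} (unresolved)")

print(clarify_ingredient("egg", "bakery"))  # chicken egg
print(clarify_ingredient("egg", "sushi"))   # fish egg (roe)
```

In practice this cross-referencing would draw on many data sources rather than a static table, but the principle is the same: context narrows an ambiguous term to a specific meaning.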
How was this carried out before the availability of predictive analytics and AI?
Well, it was very slow, and we would receive a report once a month to say, ‘the risk of pork from such and such a place has increased’, for example. We didn’t do it ourselves, we bought in that service, and there would be people sifting through that data every day and processing it. So, being able to do this much faster and have those reports daily rather than monthly is incredible. It’s very responsive and we can keep refining it now that we have the solution set up. We can add more sources, change weightings in relation to importance, and really refine it to enable us to drill down and extract the most important information. So, there is a lot that we can now tell about the importing of food based on data and our ability to interpret it with speed and accuracy.
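The idea of adjusting source weightings can be sketched as a simple weighted average over per-source risk signals. The source names, signal values and weights below are entirely hypothetical:

```python
def daily_risk_score(signals: dict, weights: dict) -> float:
    """Weighted average of per-source risk signals, each expected in [0, 1].

    Sources absent from `weights` contribute nothing; an empty input
    scores 0.0.
    """
    total_weight = sum(weights.get(src, 0.0) for src in signals)
    if total_weight == 0:
        return 0.0
    weighted = sum(value * weights.get(src, 0.0) for src, value in signals.items())
    return weighted / total_weight

# Hypothetical daily signals for one commodity, and adjustable weightings.
signals = {"regulator_alerts": 0.8, "media_reports": 0.4, "lab_results": 0.6}
weights = {"regulator_alerts": 3.0, "media_reports": 1.0, "lab_results": 2.0}
score = daily_risk_score(signals, weights)  # (2.4 + 0.4 + 1.2) / 6.0
```

Because the weights are just data, re-tuning the importance of a source (or adding a new one) requires no change to the scoring logic, which is what makes this kind of pipeline easy to keep refining.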
How do you go about identifying those food businesses that are high risk?
We’ve done a fair amount of exploration around the use of predictive analytics to prioritise inspections, or to try to work out whether a food business is high risk based on more than just what they declare to us. So, certain things make you high risk: for example, if you’re operating as a butcher, that immediately flags you as a business with a certain degree of risk, because you’re dealing directly with raw produce and its preparation. But beyond that, other factors such as location, along with other direct and indirect indicators, determine your level of risk. And again, we piloted the development of a model where a score is determined by adding up risk factors, with the highest-risk businesses being immediately flagged to local authority inspectors because they require urgent attention.
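The additive scoring model described above can be sketched very simply: each risk factor contributes points, and a business whose total crosses a threshold is flagged for urgent inspection. The factors, point values and threshold below are invented for illustration and bear no relation to the FSA’s actual model:

```python
# Illustrative risk factors and the points each contributes.
RISK_POINTS = {
    "handles_raw_meat": 40,       # e.g. operating as a butcher
    "no_recent_inspection": 25,
    "prior_hygiene_issues": 30,
    "high_risk_location": 10,
}
URGENT_THRESHOLD = 60  # hypothetical cut-off for flagging

def risk_score(factors: set) -> int:
    """Add up the points for every risk factor present."""
    return sum(RISK_POINTS.get(f, 0) for f in factors)

def needs_urgent_inspection(factors: set) -> bool:
    """Flag the business to local authority inspectors if over threshold."""
    return risk_score(factors) >= URGENT_THRESHOLD

business = {"handles_raw_meat", "no_recent_inspection"}
print(risk_score(business))               # 65
print(needs_urgent_inspection(business))  # True
```

A real model would likely learn weights from inspection outcomes rather than use a fixed table, but the additive structure keeps each flag explainable to the inspector acting on it.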
Do you come across any problems regarding data quality, and how can these be overcome?
People think that poor data quality is because the data was mis-entered into a system, or the system itself broke it! The harder challenge is where the definition and the meaning of the data are not clear, and you have to try to determine the original purpose of the information by finding out what people said and what they really meant. We regularly sit down with Food Standards Scotland, have conversations globally with all of the other regulators around the world, and talk with every single local authority in the country. Sometimes it’s necessary to check their interpretation of any data that might seem ambiguous to make sure that we all agree on its meaning.

As a separate issue, we have also spent quite some time trying to identify data biases and ascertain their impact. We know that IT systems didn’t put the biases there; they are present purely because of the way that such systems have learned from humans. So, AI as a tool is a very clever thing, but what matters is everything around it: the quality of the data, the bias in the data, and also the willingness of people to trust and act on the data that’s been delivered to help them. And the FSA is more interested in the quality of the data than in having the latest shiny AI thing, because unless you address the quality of the data it becomes all about the front end and the sparkly stuff, and behind that it’s just a mess. You need that solid and reliable data foundation.
Is an increased reliance on technology readily embraced by your network of stakeholders or is there resistance?
There is some resistance, depending on who we’re working with, and there can be an insistence on doing things a certain way because that’s how it’s always been done, even when that might involve more manual processes. There is also an element of fear associated with digging around in the data because of commercial sensitivities or GDPR, but we’re not actually dealing with any personally identifiable information at all. We are only concerned with facts and figures, timings, locations, hygiene ratings, food commodities, products and so on.
And are there any challenges related to the interpretation of complex information?
There is a lot of data there that we’re interested in, and we collect high quantities of information. One of the challenges is the sharing of data through the system. Some of the supply chains are very long and some of them very complex — and because suppliers and customers come and go and are not static, this adds an extra dimension of complexity, as does the fact that companies increasingly operate globally. This is something that needs careful management and full transparency.