AI uses trends found within the examples it’s given. Those trends are learned from and then applied to new data at a later date to form predictions and make decisions. This process is called Machine Learning, or ML.
In a world where human behaviour has remained relatively unchanged, or where any changes have been gradual, this works very well. AI can forecast trends, make accurate predictions and replicate complex decisions by comparing each scenario to the examples it was trained on. This has allowed almost every industry to easily adopt AI automation.
However, COVID-19 has forced people to drastically change their lifestyles. We now travel, socialise, work and spend money completely differently, and this drastic change in user behaviour has already started to affect the accuracy of the AI we interact with daily. This is because these AI models were trained on datasets gathered from a world before COVID.
This does not mean that planes will suddenly fall out of the sky, and this COVID-19 data obsolescence will not affect applications like optical character recognition ("OCR") handwriting transcription software. However, anyone who has used a sat nav post-COVID will have already experienced this discrepancy, with years of historical traffic patterns misaligning with live traffic conditions.
Larger datasets collected over a long period of time are best for use in AI, so why don’t we just keep old data and adjust it to reflect the data we have collected since COVID? Unfortunately, that won’t work: as the global community adjusts to the “new normal”, these adjusted datasets will themselves become outdated. Furthermore, as the pandemic continues to evolve, our behaviour will too; this means that any data collected since the start of the pandemic could also become outdated more quickly. This effect is so pronounced that, if not accounted for, it will continue to affect the accuracy of AI models that have not yet been created.
This accelerated obsolescence of data in the new normal is going to hit just about every industry that uses AI, as well as the AI itself. Existing AI systems will become outdated as they continue to make predictions for a pre-COVID world, which in itself will have many real-world implications.
For example, companies that rely on AI to tap into customer spending habits may find their profits fluctuate, whereas regular travellers who rely on satellite navigation systems could receive incorrect arrival time predictions.
Simply put, it will make existing AI inaccurate and no longer fit for purpose.
Without shifting AI systems in line with the shift in global behaviour, not only will these systems lose accuracy, but organisations that continue to use the inaccurate forecasts will see their services fall behind the curve.
But wait a minute: doesn’t all data go out of date? Surely this isn’t a new problem? As we say at INEVITABLE, well, yes and no. In Machine Learning (ML) we often apply a weighting to adjust for this half-life of data, meaning that the further back you go in a dataset, the less influence each data point has on the resulting AI. However, this half-life depreciation does not account for the suddenness of the changes, or the uneven distribution of the effects, associated with COVID.
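As a rough sketch of what that weighting looks like in practice (the half-life value and the data here are hypothetical), older samples can be down-weighted exponentially before training, so a data point one half-life old counts half as much as a fresh one:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def half_life_weights(ages_in_days, half_life_days=365.0):
    """Exponential decay: a sample one half-life old gets weight 0.5."""
    return 0.5 ** (np.asarray(ages_in_days, dtype=float) / half_life_days)

# Hypothetical training set: features X, labels y, and each row's age in days.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X[:, 0] + rng.normal(scale=0.5, size=200) > 0).astype(int)
ages = rng.uniform(0, 1000, size=200)

# Most scikit-learn estimators accept per-sample weights at fit time.
weights = half_life_weights(ages)
model = LogisticRegression().fit(X, y, sample_weight=weights)
```

The weakness the article points to is visible in the first function: the decay is smooth and gradual, so it cannot represent an abrupt break like March 2020, where data from a few months earlier may be far less relevant than its age alone suggests.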
Combatting this issue and its effects will require an overhaul of all affected AI and AI-dependent systems. This might even require a shift away from the normal method of creating AI (train an algorithm on data, export the model for use) towards AI that trains and auto-updates itself on the fly. The vast majority of AI systems currently in place are not auto-updating, meaning that supervised retraining is required as the dataset changes. Furthermore, AI systems will need to become more data-efficient and make predictions from datasets spanning significantly smaller time periods.
Necessity is the mother of invention; as we pull together in the fight against COVID, we see many cases where this adversity has been a catalyst for innovation. For example, HealthTech is an industry that, understandably, has been quick to adapt. NHS trusts across the UK have implemented AI-powered digital workforces developed by Blue Prism to meet increased operational demand. BenevolentAI adapted their drug discovery technology to learn more about the body’s response to the disease, leading to the rheumatoid arthritis drug baricitinib being trialled as a potential treatment.
The solution to these issues is simple: newer, more efficient algorithms. Algorithms that can auto-update and make smarter predictions from less data will mitigate the risks of using smaller datasets drawn from an ever-changing world. The creation and use of such algorithms will need to take a more integrated approach too: using data science to deploy and use models effectively, and ensuring greater care is taken so that the dates, context and influences behind the data are recorded and accounted for. When in doubt, seek advice; don’t risk it.
Open-source AI, that is to say AI based on data aggregated from public or open sources, will feel the impact of COVID-19 in the same way. Therefore, those working with open data will need to pay extra attention to their models. The dates attached to datasets will need to be recorded in the metadata or documentation. If your project relies on an untagged AI model, unit test it to see if it’s still viable.
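Such a viability check can be as simple as asserting that the model still clears an accuracy threshold on data collected after the behavioural shift. A hedged sketch, where `ThresholdModel` stands in for whatever trained model your project actually uses and the 0.8 threshold is an illustrative assumption:

```python
import numpy as np

class ThresholdModel:
    """Stand-in for any trained model exposing a predict() method."""
    def __init__(self, cut=0.0):
        self.cut = cut

    def predict(self, X):
        return (np.asarray(X)[:, 0] > self.cut).astype(int)

def is_still_viable(model, X_recent, y_recent, threshold=0.8):
    """Check accuracy on post-change data against a minimum threshold."""
    acc = float((model.predict(X_recent) == np.asarray(y_recent)).mean())
    return acc >= threshold

# Usage: run this as a unit test against a held-out slice of recent data.
model = ThresholdModel(cut=0.0)
X_recent = np.array([[1.0, 0.0], [-1.0, 0.0], [2.0, 0.0], [-2.0, 0.0]])
y_recent = np.array([1, 0, 1, 0])
viable = is_still_viable(model, X_recent, y_recent)
```

Running a check like this on a schedule, against freshly labelled data, turns the vague worry "has COVID broken my model?" into a concrete pass/fail signal.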
Much like every other area of life, reacting to COVID-19 requires a quick adjustment within AI, an adjustment that will feel forced or unnatural. However, with change being the only constant as we leave the threat of lockdown, it is important for systems to keep up with the ever-evolving landscape to ensure their continued efficacy, accuracy and safety. So even if your systems appear to be unaffected, it’s better to check them over anyway. It’s like taking the car out after it has spent a long time sitting on the drive: by checking, you’ve either just bought yourself peace of mind, or just saved yourself a lot of money.
If you believe your business could benefit from a review or redevelopment of its current AI systems, talk to us.
We love talking all things tech,
So whether you're: