The Challenges of Delivering Healthcare Tech
A key challenge in implementing technological change within the healthcare sector is ensuring that it is achieved within a heavily regulated legal framework.
“This is appropriate for an emerging industry, where it’s still unclear which regulations will be useful or effective,” says David Talby, CTO of Pacific AI – a consulting firm that specialises in AI, big data and data science. “The current regulations [are] focused on known and obvious risks – like privacy and safety. Case law and regulation in other areas – like explainability [for AI], for example – will gradually evolve over time as people better understand the technology and its potential implications.”
And the stakes are inevitably high in healthcare. While it might be acceptable for Netflix to make sub-optimal recommendations based on its algorithms, for example, a medical AI tool with a statistically high failure rate could very quickly put human lives at risk.
Utilising AI on reliable data
AI is only as good as its training data. If an algorithm (to detect skin cancer, for example) has been trained exclusively on images of white people, then it may struggle with other skin tones. Furthermore, AI services can suffer an “uncanny valley” effect – if they’re close to perfect but just miss the mark, user trust can collapse along with the utility of the platform. For all of these reasons, it’s likely that AI will complement human experts – rather than replace them – for the foreseeable future.
So does this mean that there’s a more measured pace of development in the sector? “The product release cycle will be different when there are regulatory or safety risks,” says David. “Pushing daily software updates to production is a great idea for ecommerce sites but a horrible one for pacemakers. While compliance makes life harder, I’m in favor of slower progress in exchange for a much higher standard of privacy.”
Ensuring the security of healthcare data
And considering the sensitive nature of medical data, privacy and cybersecurity are of paramount importance (particularly when potentially insecure IoT devices are added to the mix). Vast organisations such as the NHS, with multiple systems, networks, and varying degrees of system security, demonstrate the vulnerability of medical IT to potential cyber attacks.
Nevertheless, with the right resources, it’s a problem that can be addressed, says David. “Security and compliance are very solvable problems – it takes effort and knowledge to properly set up a secure AI platform, but it’s perfectly doable,” he says.
Empiric is a multi-award winning business and one of the fastest growing technology and transformation recruitment agencies, specialising in data, digital, cloud and security. We supply technology and change recruitment services to businesses looking for both contract and permanent professionals.
Empiric is committed to changing the gender and diversity imbalance within the technology sector. In addition to Next Tech Girls, we proactively target skilled professionals from minority groups, which in turn can help you meet your own diversity commitments. Our active investment in the tech community allows us to engage with specific talent pools and deliver a shortlist of relevant and diverse candidates.
For more information, contact 0203 675 7777.