The risks to mental health posed by our use of smartphones, social media, and similar tech-related aspects of modern life are well-reported. 

At the same time, many organisations are already providing or developing tools which use technology to support and promote mental health and wellbeing, ranging from meditation apps to AI-powered online counselling.

And as individuals and businesses become increasingly aware of the importance of mental health, the prevalence of such tech solutions is set to grow.  

The government is conducting a three-year research project into such 'digital mental health technologies' (DMHT) and is considering the most appropriate model of regulation.

 In this context, the Medicines and Healthcare Products Regulatory Agency (MHRA) recently published a report summarising the outcome of its latest phase of research into public awareness of DMHT.  

The full text of the MHRA report, Digital mental health technology: user and public perspectives, is available on GOV.UK.

In this note, our life sciences and commercial team discuss the report's background, summarise some of its findings, and consider the potential implications for businesses operating in or considering entering this space.   


The background – 'Software as a Medical Device'

The government's review of DMHT takes place in the context of a broader project focused on the role played by software (including AI) in health and social care generally – the Software and AI as a Medical Device Change Programme (the Change Programme), which was originally announced in September 2021.

The Change Programme considers, among other things, the interfaces between software products and the regulatory regime for 'medical devices' under the Medical Devices Regulations 2002 (as amended).

Its objective is to provide clarity for patients, suppliers, and healthcare providers, as well as ensure that patients are adequately protected.  

Guidance is already available for manufacturers of general health-related technology solutions (including standalone software products such as apps), which may be subject to the medical device regulatory regime if they have what is defined as 'a medical purpose'.    

Software and technologies focused on mental health form a sub-set of such technologies and raise specific considerations which do not fall neatly within existing guidance.

With that in mind, the MHRA, working in partnership with the National Institute for Health and Care Excellence (NICE) and with funding from the Wellcome Trust, embarked in 2023 on a three-year project to formulate guidance and other sources of information for developers, healthcare professionals, patients and the public, to clarify the regulatory and evaluation requirements for DMHT.  

The output of the first work package in that project is the report titled Digital Mental Health Technology: User and Public Perspectives, which was published in April 2024 and is discussed below.


Perspectives on DMHTs

The report provides an interesting snapshot of public awareness and opinion in respect of DMHT and their regulation, as well as some insights into public perception of the tech sector more generally.

The writers of the report noted that, among their respondents, there was a perception of 'a growing demand for mental health services', and a sense that 'capacity could not keep up, leading to long waiting times and less effective support'.

Responses suggested a view that since 'unlike many physical conditions, mental health was often perceived to be very individual in terms of causes, treatment and support', the ideal was 'therefore an integrated package of care designed for each patient'.

In respect of experience with DMHT, around half of respondents had used apps to support their mental health, but mostly these were 'basic products such as mood trackers, sleep and relaxation and meditation apps'.

While respondents recognised that DMHT could 'make a valuable contribution' within mental health care and support, they were also conscious of the risk 'they would be used to try and cover over failings in the mental health care system'.

In particular, they noted that if DMHT were being relied on by anyone with 'a serious mental health condition', the apps 'should provide a route for users to connect rapidly to a healthcare professional to seek help' and should also alert a healthcare professional if a user 'displays worrying behaviour'. 

Perspectives on regulation

On the question of how best to regulate DMHT, the writers of the report noted that respondents generally found it difficult to engage on the topic. 

In some cases, this was because the respondents assumed DMHT was not regulated and, like apps generally, was more of a 'wild west' – an interesting perspective, given the significant regulatory checks that must already be satisfied by providers of 'Software as a Medical Device'.

While respondents were generally relaxed about the risk profile of DMHT and the manner in which they might be regulated, the two key exceptions were 'data security' and 'ensuring that any interaction with mental health professionals should be fully regulated'.

Recent press coverage of cyber-criminals who blackmailed their victims using stolen records of confidential conversations between patients and their psychotherapists supports both concerns.  


Considerations for providers

As its title indicates, the report focuses on 'User and Public Perspectives' on DMHT, and it represents an early phase in the wider project.

As well as reporting current opinion, however, it does give some indications of potential regulatory conclusions, which may be of interest to businesses which may, in due course, be subject to that regulatory regime.  

  • The function of the specific technology was naturally viewed as significant to the level and type of regulation that would be required.  
  • For example, apps focused on 'care' (i.e. general mental wellbeing) were viewed differently from those which might be held out as a 'cure' for an identified condition (with the latter requiring more stringent regulation).  
  • Similarly, an app that replaced a physical item or a process (such as 'a self-help book, journal, yoga class or meditation session') was viewed differently from an app that replaced interaction with a human (e.g. 'diagnosis, triage, therapy or case management'), with the latter again representing a higher risk and greater regulatory concern. 
  • The public were also concerned about the need for guidance for those involved in advising on the design or use of DMHT, and considered this 'as much if not more of a priority than guidance for the public'.  
  • Regulation could be focused on developers of DMHT, with a requirement to ensure that risks associated with use of such technologies are mitigated by design, for example by ensuring that a user accessing an AI-enabled counselling platform gets access to a health care professional when required.
  • Ensuring that users of DMHT are properly informed about the technology was viewed as important, and the obligation to provide this information is a potential regulatory requirement for providers.  
  • Such information might include the circumstances in which, and the ways in which, the product could help the user; any warning signs that might arise from its use, such as anxiety; and how to get help if the product was not providing the care and support the user needed, or might be causing them harm.

These are initial indicators only, though, and the main message for providers is to continue to monitor developments as the project progresses. 


Contact Our Life Sciences Team

If you have any questions or would like more information regarding this article and how this may affect your business, please contact our Life Sciences Solicitors, who would be happy to assist.