Chapter Seven
Analysis by Research Activity, Type of Award and Time Taken
Medical research can be grouped into three broad categories, based on what the research is examining. Of the awards assessed in this report:
- 51% were for Cause – looking at normal function and what causes diseases to arise
- 40% were for Cure – looking at ways to detect diseases, and to develop and test new treatments
- 9% were for Care – looking at how diseases and their outcomes can be managed, both for the person and at the healthcare level
Different kinds of research would be expected to lead to different impacts, so as well as examining the outputs of charity-funded research across the five areas of impact, we also looked for patterns linking the kind of research being funded with the kinds of impact reported.
Research activity
The average number of outputs varied across the different output types, so to compare them we need to look at the trend for each output type. We have used two graphs to show this – one for output types where the average was above 1.0 (Figure 18), and one for those where it was below 1.0 (Figure 19).
Figure 18. Average number of outputs per research activity where average output is greater than 1
Figure 19. Average number of outputs per research activity where average number of outputs is less than 1
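By way of illustration, the sketch below shows one way such averages could be computed from a long-format extract of the data, and how output types could be split into the two ranges used for Figures 18 and 19. This is a minimal sketch assuming hypothetical file and column names; it is not the analysis code used for this report.

```python
import pandas as pd

# Hypothetical extract of the outputs data: one row per award and output type,
# with zero counts included so means are averages per award, not per reporter.
# The file name and column names are assumptions, not the report's actual schema.
df = pd.read_csv("award_outputs.csv")  # award_id, activity, output_type, count

# Average number of each output type per award, for each research activity
avg = (
    df.groupby(["activity", "output_type"])["count"]
      .mean()
      .unstack("activity")        # rows: output types, columns: Cause / Cure / Care
)

# Split into the two ranges plotted in Figures 18 and 19
high = avg[(avg > 1.0).any(axis=1)]     # output types averaging more than 1 per award
low = avg[~(avg > 1.0).any(axis=1)]     # output types averaging 1 or fewer per award
```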
Cause was the most productive research activity, followed by Cure and then Care.
Cause research activities led to the highest average number of:
- Publications
- Awards and recognitions
- Tools and methods
- Spin outs
- Examples of further funding
- Next destination and skills
Cure research activities led to the highest average number of:
- Databases and models
- Medical products
- Protected and licensed intellectual properties
- Technical products
- Partnerships
Care research activities led to the highest average number of:
- Policy influences
- Engagement activities
This preliminary information may help funders to check whether the research areas they fund are linked to the outcomes they expect.
Analysis by type of award
Funders use a number of types of awards to support research, ranging from short ‘pilot’ grants to large multi-project ‘programme’ grants, and from studentships (‘PhD awards’) supporting trainee researchers to ‘chairs’ supporting senior academics. As part of this analysis, we were keen to examine whether there were any trends in the outputs reported for different types of awards. Of the portfolio:
- 55% were awarded for projects (short and long term, small and large scale)
- 20% supported people (studentships, fellowships, senior awards)
- 3% were for infrastructure (equipment, stand-alone units)
For many funders, project and people awards are the major way in which they give money. Comparing the average number of outputs for both award types, we can see that in most areas of impact there is not much difference between the two. However, project awards had a higher average number of partnerships, and people awards had slightly more instances of further funding.
Figure 20. Average number of outputs for Projects and People grant types
When did impact happen?
We know that there will be a delay between when a new award starts and when the research leads to an outcome; the data collected by charity-funded researchers allows us to see the trends. In this section, we have shown the average number of outputs in each category in each year since the award started.
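As a rough sketch of how such ‘time to impact’ trends can be derived: assuming a long-format table with one row per award, output category and year since the award started (the file and column names below are hypothetical), the averages behind the figures in this section could be computed along these lines.

```python
import pandas as pd

# Hypothetical long-format table: one row per award, output category and year
# since the award started, with zero counts included. Names are illustrative only.
outputs = pd.read_csv("outputs_by_year.csv")  # award_id, category, years_since_start, count

# Average number of outputs per award, in each category, for each year since award start
time_to_impact = (
    outputs.groupby(["category", "years_since_start"])["count"]
           .mean()
           .unstack("category")    # rows: years since start, columns: output categories
)

# For example, the trend underlying Figure 21 (publications) could be drawn with:
time_to_impact["Publications"].plot(marker="o", title="Average publications per award")
```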
Generating new knowledge
Figure 21. Average number of publications generated after award started
The average number of publications increased each year, peaking at 8-9 years after the award date. This may reflect the small number of ‘older’ awards that were reported on, or researchers considering such awards ‘inactive’.
Figure 22. Average number of research tools and methods generated after award started
When the number of awards in each year is taken into account, tools and methods are more likely to appear in the sixth year, with a second peak at 10 years after award.
Figure 23. Average number of databases and models generated after award started
When the number of awards in each year is taken into account, databases and models are more likely to appear in the third year, with a second peak at 8 years after award.
Translating research
Figure 24. Average number of registered, protected and licensed intellectual properties, per award, generated after award started
The average number of registered, protected and licensed intellectual properties increased year on year, peaking at 9 years after the awards started.
Figure 25. Average number of spin out companies generated after award started
When the number of awards in each year is taken into account, spin out companies are more likely to appear in the fifth year, with a second peak at 10 years after award.
Figure 26. Average number of developed and tested medical products and interventions generated after award started
When the number of awards in each year is taken into account, medical products are more likely to appear in the tenth year.
Figure 27. Average number of software and technical products generated after award started
When the number of awards in each year is taken into account, technical products are more likely to appear in the ninth year.
Figure 28. Average number of policy and practice influences generated after award started
The average number of policy influences increased year on year, peaking at 10 years after the awards started.
Figure 29. Average number of engagement activities generated after award started
When the number of awards in each year is taken into account, engagement activities are more likely to appear in the seventh year, with a second peak at 10 years after award.
Figure 30. Average number of partnerships generated after award started
Partnerships appear to happen earlier in the lifetime of an award than most other outputs, reaching a peak at 3-4 years and remaining at a roughly constant rate thereafter.
Looking at these ‘time to impact’ graphs, it is clear that:
- the lag time varies for different types of impact
- most ‘peaks’ are seen 5 years or more after the award started, indicating that funders and researchers need to track impact over a long period in order to see the impact of research
- as many of the awards in this analysis were in their 4th year, it is likely that the picture will change over time, strengthening any preliminary trends seen here