We’ve got a nice project which went live today – we’ve been providing the final performance analysis for the £17m government-funded Retrofit for the Future programme, which has been running since 2009. The programme retrofitted 120 homes in the UK with multiple energy-saving and energy-efficiency technologies, and collected data through over 1300 sensors and meters taking readings every 5 minutes. The aim of the programme was to inform the planning and development of retrofit and new-build homes to help the UK meet its energy goals. The final results were launched at Ecobuild today, and they’re also up on this website.
As you might imagine, that many meter readings accumulate into a pretty big dataset, and it’s been a lot of fun wrangling it into shape. The sensors came in all shapes, sizes, and levels of reliability, so (as always) a major part of the work was cleaning and making sense of the data before we could do much with it. The good news is that the results are great – an average reduction in emissions of over 60%, with some homes reducing by more than 80% (that is, to less than a fifth of their original emissions). There’s also plenty more we could do with the same dataset – we haven’t even begun to dig into the behavioural and other factors revealed by such fine-grained information.
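As a minimal sketch of the kind of cleaning this sort of 5-minute meter data needs – every value and threshold below is invented for illustration, not taken from the programme’s dataset – one common first pass is to drop implausible sensor spikes and forward-fill short gaps:

```python
from datetime import datetime, timedelta

# Hypothetical 5-minute meter readings: (timestamp, kWh per interval).
# None models a dropped reading; 99.0 models a sensor glitch.
start = datetime(2010, 1, 1)
raw = [(start + timedelta(minutes=5 * i), v)
       for i, v in enumerate([0.12, 0.11, None, 0.13, 99.0, 0.12])]

def clean(readings, max_plausible=10.0):
    """Replace missing or implausible values with the last good reading."""
    cleaned = []
    last_good = None
    for ts, v in readings:
        if v is not None and v <= max_plausible:
            last_good = v
        cleaned.append((ts, last_good))
    return cleaned

values = [v for _, v in clean(raw)]
print(values)  # [0.12, 0.11, 0.11, 0.13, 0.13, 0.12]
```

In practice the plausibility threshold would come from the sensor’s spec or the home’s connection capacity, and longer gaps would be flagged rather than filled, but the shape of the job is the same.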
So for now, take a look at the report, and you can register to gain your own access to the raw data at https://est.amee.com/. We’ll be releasing the analysis code shortly, as well.
More recent and upcoming talks from Team Mastodon:
A couple of recent ones that are up online:
- Using data for EVIL! at the Strata Data Warfare conference
- Big Data comes to the NHS at ODI Fridays lunchtime lecture
More public talks coming soon:
Prescribing Analytics got an awful lot of positive press coverage over the last few weeks.
It’s very exciting to see something we’ve made be covered so extensively – it was on the front page of the Independent, on the PM programme on Radio 4 (47 mins in), in the Economist and the FT, and even on the Revolution Analytics R blog, as well as various medical and pharma blogs.
It’s been interesting to see the different ways this piece of work has been interpreted by different journalists. In the text on the site, and when being interviewed, we were very careful to emphasise the context of the data and that NHS prescribing advisers already do a lot of work on this issue. Most of the responses we had seemed to understand this very well, and it’s been great to see how many NHS employees have been in touch both to say that they enjoy it and to suggest future developments that would be helpful. We did get a few sensational headlines, but most of the coverage was brilliantly balanced, with Radio 4 and the Economist being especially impressive in the care they took to portray the issue fairly.
Anyway, the amount of value that this small project has been able to create makes it pretty clear that there’s a lot more to be done with public health data. It also makes me realise that people loooove maps as a data visualisation. There’s a lesson for every project there: fit in a map if you can – it’s fun and interesting to play with, and people pay much more attention to the data as a result.
We’ve finally published the prescribing analytics portal here.
It covers prescribing variation of statins in the UK last year, making it easy to spot variation that could be costing the NHS extra money. I definitely recommend taking a look – it’s fascinating to explore.
It got some great coverage too – the Financial Times, the Economist, and Huffington Post all had approving mentions.
Last week, the NHS Information Centre released another quarter of a billion rows of data, so there’s a lot more to be done here, and this kind of big data is exactly our bag. Watch this space!
Don’t tell anyone yet, but we’ve got a big project going live soon which aims to help the NHS save an awful lot of money. It turns out that by altering GPs’ prescribing behaviour for a few drugs – swapping proprietary forms for generics where appropriate – it’s possible to save hundreds of millions of pounds a year. Working with the doctors at Open Healthcare UK, we’ve clarified how prescriptions can and should safely be changed, done the detailed financial analysis, and created maps and rankings of exactly which GPs are spending what.
The full details will all be published in due course, but actually doing the project has reminded me of a few basic principles that come up again and again:
1. Good domain knowledge usually beats super-smart algorithms. Working with qualified doctors and using their understanding of the ins and outs of prescribing behaviour was massively more important to this project than the mathematical and analytical aspects.
2. The effort of getting data doesn’t necessarily correlate with its importance. This whole project was based on open data from the NHS Information Centre, free for anyone to download – acquisition was easy; actually doing something with the data was the harder part.
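The core financial calculation behind a project like this is simple to sketch. Everything below is invented for illustration – the practice names, item counts, and prices are not from the NHS dataset – but the shape is: the potential saving is the proprietary spend minus what the same number of items would cost as generics.

```python
# Hypothetical prescribing rows for a proprietary statin:
# (practice, items dispensed, total cost in £).
proprietary = [
    ("Practice A", 120, 3600.0),  # 120 items at £30 each
    ("Practice B", 40, 1200.0),
]
generic_unit_cost = 2.0  # assumed cost per generic item, in £

def potential_saving(rows, generic_cost):
    """Saving if every proprietary item were swapped for the generic form."""
    return sum(cost - items * generic_cost for _, items, cost in rows)

print(potential_saving(proprietary, generic_unit_cost))  # 4480.0
```

The real analysis is harder precisely because of point 1 above: deciding which swaps are clinically appropriate is a job for doctors, not for this arithmetic.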
From the Bethnal Green Ventures Demo Day – my 5 minute version of what we do, told mostly through the medium of Muppets.