The Developing Communimetric Community
by Dan Warner
Executive Director and Founder, Community Data Roundtable
Originally posted June 19, 2017 on the CDR Blog.
I first met John Lyons in 2011, when he was doing a training in Pennsylvania. Instantly impressed by his vision for how to measure outcomes in social services, I struck up a conversation with him after his lecture about the business model he was using to roll out the CANS. I was confused: he doesn’t have an app? He gives away the form and manual for free? There isn’t a licensing fee hidden somewhere?
“No,” John explained to me, “and it’s not that I’m giving it away, either.” John went on to explain that he was sourcing the tool’s development from all of its users: trainers have to submit vignettes he can use for training, new CANS versions will improve the item definitions, and analytic techniques he could never dream of will be applied, providing insight far beyond his original vision. The “TCOM” approach is built on the model of Linux, OpenOffice, and Craigslist, where a central body gets things started and tries to organize the efforts of thousands of participants. These independent efforts advance the field through their own application of the original ideas, allowing a whole ecosystem to emerge that is more creative and robust than a centralized model could accomplish.
One place we see the power of this approach is in the ongoing work people are doing across the globe to find the right way to measure the “bottom line” outcome of a communimetric tool. To get the conversation started, the Praed Foundation has put out two versions of its recommended “Reports Suite,” which shows some basic, powerful ways that communimetric data can be aggregated to show outcomes for groups and programs. At Community Data Roundtable we use many of these reports, both in our online CANS app, the DataPool, and in the web-based dashboards we build for the communities in which we are active. These reports are clear, accurate, and helpful, and they are helping us at CDR build collaborative insight and action in the communities where we work.
But these reports are already being innovated upon, and we are pushing toward the next level. During the recent Communimetric Data Quarterly Roundtable that CDR hosts, we heard two presentations from groups that are building off these original reports to create new and innovative ways to measure a CANS “bottom line.”
First, Jesse Troy, Ph.D., of CDR shared how we have been developing a Needs Improved to Needs Worsening ratio to identify outcomes. The model is a ratio of two percentages: the percentage of needs that improved over the percentage that got worse. In building the model we decided not to use all of the CANS items on our tool, because including variables that may not be central to what a service actually treats adds noise. To identify the items we should use, we applied an original technique from the Praed Reports Suite, called MultiLevel Collaborative Formulation (MLCF), in which the most salient treatment needs are identified using a certain reduction model. Our new outcomes metric thus builds off the MLCF and tries to take it to new levels.
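As a rough illustration of how such a ratio could be computed, here is a minimal sketch. The item names, the 0–3 rating scale, and the definitions of “improved” (rating decreased) and “worsened” (rating increased) are assumptions for illustration only; CDR’s actual model works from MLCF-selected items and its own definitions.

```python
def improvement_ratio(initial, followup):
    """Ratio of the share of needs that improved to the share that worsened.

    `initial` and `followup` map item name -> rating (0-3), restricted to
    the items selected as salient for the service (illustrative assumption).
    """
    items = initial.keys() & followup.keys()   # items rated at both points
    improved = sum(1 for i in items if followup[i] < initial[i])
    worsened = sum(1 for i in items if followup[i] > initial[i])
    total = len(items)
    if total == 0 or worsened == 0:
        return None  # ratio undefined without items or without any worsening
    return (improved / total) / (worsened / total)


# Hypothetical example: three items improved, one worsened, two unchanged.
initial  = {"anxiety": 2, "depression": 3, "school": 2,
            "family": 1, "sleep": 2, "anger": 1}
followup = {"anxiety": 1, "depression": 2, "school": 1,
            "family": 1, "sleep": 3, "anger": 1}
print(improvement_ratio(initial, followup))  # (3/6) / (1/6) = 3.0
```

Because both percentages share the same denominator, the ratio reduces to improved count over worsened count; keeping the percentages explicit simply mirrors how the metric is described.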
Likewise, Allison Krompf of Northwestern Counseling and Support Services (NCSS) in Vermont shared how she has developed a “CANS Severity Score” modeled on the Praed Reports’ “Support Intensity” approach, which adds together all of the 2s and 3s (weighting each rating by its value). NCSS is doing many innovative things with this metric, including using it to derive a CANS value metric: the change in severity over the per-member-per-month costs of their population, which shows the value of the change that the CANS can measure.
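A minimal sketch of these two computations, under the assumption that the severity score is the sum of all actionable ratings (the 2s and 3s) and that the value metric divides the change in severity by a per-member-per-month cost; NCSS’s exact formulas may differ, and all numbers below are hypothetical.

```python
def severity_score(ratings):
    """Sum of all actionable ratings (2s and 3s) on one assessment."""
    return sum(r for r in ratings if r >= 2)


def value_metric(initial_ratings, followup_ratings, pmpm_cost):
    """Change in severity per dollar of per-member-per-month cost."""
    change = severity_score(initial_ratings) - severity_score(followup_ratings)
    return change / pmpm_cost


# Hypothetical ratings for one person at intake and follow-up.
initial  = [0, 1, 2, 3, 3, 2]   # severity = 2 + 3 + 3 + 2 = 10
followup = [0, 1, 1, 2, 3, 0]   # severity = 2 + 3 = 5
print(severity_score(initial))                           # 10
print(value_metric(initial, followup, pmpm_cost=500.0))  # 5 / 500 = 0.01
```

Summing only the 2s and 3s means a 3 contributes more than a 2, so more intense needs carry more weight in the score, which matches the “weight appropriately” idea described above.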
These are just two examples of the ways the original Praed Reports Suite is being incorporated and expanded upon by independent professionals joining the collaborative. We can only imagine what we will find next.
For those interested in hearing the presentations on these metrics, please contact Community Data Roundtable at email@example.com.
Thank you Dan Warner for sharing your reflections!