TCOM CONVERSATIONS

by: Evelyn Kintner and Max Kintner, Northern Rivers

The October 2017 TCOM Conference provided an incredible opportunity for Max and me to present with Stephen Shimshock and Yvonne Humenay Roberts from Casey Family Programs. Although initially intimidated by our illustrious co-presenters, Max and I needed all of 20 minutes over lunch to form a bond with Stephen and Yvonne. Combining our presentations was an easy affair, since we shared a common focus on the two individuals in the therapeutic exchange: the practitioner, who oversees care provision and represents the agency in interactions with the family/youth; and the client, who should be both a partner in the decision-making processes related to their care and a recipient of services. Like Casey Family Programs, our efforts to improve quality of care at Northern Rivers Family of Services (NRFS) revolve around engaging practitioners in an open dialogue about data and how analytics can be used to improve one’s practice. Our combined presentation demonstrated how that might be done for a mid-size organization such as Northern Rivers and a large multi-site organization such as Casey.

How do you implement a Communimetrics-based evaluation model in a mid-size organization?

It was exactly five years ago this month that I was handed some data and asked by Executive Leadership to “see how much you can get out of the CANS as far as analytics.” Not even two years out of the ivory tower of academia, I ran the data and produced a highly technical report that fully demonstrated the utility of CANS data for risk profiling and outcome evaluation. Unfortunately for me, however, while the report proved the tool was capable of supporting “predictive analytics,” I quickly learned that, at the time, only a small audience at Northern Rivers understood (or cared about) how metrics could inform the provision of care. Luckily, I had the support of two key individuals: first, the Chief of Quality Management, who, as my supervisor, encouraged me to find venues through which to introduce data and analytics to a doubting service provider base; and second, a colleague (and brother) who, as one of the doubting practitioners in an NRFS affiliate, challenged me to rethink evaluation in terms of its impact on the “practice of child welfare” and its service providers. It was through the purposeful establishment of an open dialogue with practitioners and programs about TCOM and the role of data that progress was made.
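
For readers curious what “risk profiling” from CANS data can look like in practice, here is a minimal sketch. It assumes only the standard CANS convention that items are rated 0-3, with a 2 or 3 marking an actionable need; the domain names, item names, data layout, and function are illustrative assumptions, not the actual NRFS analysis.

```python
# Minimal, illustrative CANS risk profile: count actionable needs per
# domain. CANS items are conventionally rated 0-3, where a rating of 2
# or 3 indicates an actionable need. The data layout and item names
# below are hypothetical, not the actual NRFS data model.

ACTIONABLE_THRESHOLD = 2

def risk_profile(assessment):
    """Return the count of actionable needs in each domain for one youth."""
    return {
        domain: sum(1 for rating in items.values() if rating >= ACTIONABLE_THRESHOLD)
        for domain, items in assessment.items()
    }

# Hypothetical ratings for one youth.
youth = {
    "Behavioral/Emotional Needs": {"Depression": 2, "Anxiety": 1, "Anger Control": 3},
    "Risk Behaviors": {"Suicide Risk": 0, "Runaway": 2},
    "Life Functioning": {"School": 1, "Family": 2},
}

print(risk_profile(youth))
# -> {'Behavioral/Emotional Needs': 2, 'Risk Behaviors': 1, 'Life Functioning': 1}
```

Aggregated across a program’s admissions, counts like these are one simple way such profiles can surface which need domains cluster in a population.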

So what worked at Northern Rivers?

Although we are still in the implementation phase, three strategies have, in retrospect, proven very effective in moving forward a metrics-based evaluation agenda at Northern Rivers. These are:

  1. Getting the engagement of Executive Management by creating opportunities to use data to further the agency’s goals. For NRFS, the first chance to demonstrate the utility of a Communimetrics assessment tool came shortly after the release of the unread technical report, when a personnel change opened up an opportunity to perform utilization review (UR), chart auditing, AND retrospective coding of admission records using the CSPI (Childhood Severity of Psychiatric Illness, the parent of the CANS). The risk profiles that emerged from the retrospective coding piqued curiosity, leading to a request by the Executive Leadership Team (ELT) for a full risk assessment of youth admitted to our Office of Mental Health (OMH) Day Treatment and Residential Programs. Many benefits derived from the “UR Project,” including the use of the data to support an application for a capital building grant for ER/hospital diversion that resulted in an award of over $4 million. CANS and FAST data are now commonly used by ELT members when advocating for the agency with potential donors, regulators, and state policy makers.
  2. Partnering with your Practitioners by giving them the data and guidance/training they need to do their jobs better today, not tomorrow. In retrospect, the most difficult task in implementing a metrics-based model of care has been gaining consensus among service provision staff that metrics are a “good thing” that can improve their practice skills and, as a result, the quality of care being provided. For Northern Rivers, the first foray into practitioner conversion came in response to a request from a director who wanted coding guidelines related to “masking effects” and “the 30-day rule.” The response was QM’s development of a technical manual and training curriculum for ensuring fidelity to TCOM principles. Although sections of the manual/training have been very well received, practitioners tend to see their practice as an “art,” not a “science,” and thus view some guidelines, such as those on crafting measurable CANS goals from anchor descriptions, as “too impersonal, too intrusive, and too scripted.” (My brother has coined the term “Paint by Number care” to capture this resistance.) Recently, however, the aversion to metrics has somewhat subsided as agency and program staff have become more successful in using assessment data to document client progress and/or advocate the agency’s case with regulators and policy makers. No doubt it has also helped that New York State is transitioning its Medicaid youth from a fee-for-service model of care into managed care. Metrics are no longer seen as “optional” by anyone.
  3. Formalizing expectations in an Evaluation Plan that incorporates TCOM principles. To be effective, it is imperative that all levels of staff, from direct care workers to executive leadership, clearly understand expectations relating to the conduct of assessments and the use of data. If expectations are not integrated into a formal plan, organizational drift will occur and continuous quality improvement (CQI) efforts will stall. Formalization performs two key functions: it clarifies what the agency considers important, and it makes concrete the framework through which program efforts are evaluated.

Evaluation activities at Northern Rivers hinge on four core TCOM constructs:

  1. Fidelity to TCOM principles, measuring the capacity of programs, management, and direct care staff to utilize the CANS to inform care decisions;
  2. Workforce competencies, measuring the proficiency of: i) workers in assessing needs and developing goals using the CANS; ii) supervisors in monitoring fidelity and caseloads; and iii) management in informing CQI efforts and business decision making;
  3. Compliance & Care Management (i.e., Managed Care Readiness) using the CANS to document medical necessity, manage workloads, and monitor network needs; and
  4. Impact evaluation to assess client, worker, and program outcomes (a minimal sketch of one such outcome computation follows this list).
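
To make the fourth construct concrete, here is a minimal, hypothetical sketch of one common TCOM-style outcome computation: comparing a youth’s intake CANS to the most recent reassessment to count needs resolved, newly identified, and still actionable. The 2-3 threshold follows the standard CANS convention, but the function, item names, and data layout are illustrative assumptions, not the NRFS evaluation plan itself.

```python
# Illustrative outcome comparison between two CANS administrations for
# one youth. Ratings of 2-3 are "actionable" per standard CANS
# conventions; the item names and layout here are hypothetical.

def actionable(rating):
    return rating >= 2

def outcome_summary(intake, latest):
    """Classify each assessed item as resolved, newly identified, or still open."""
    items = set(intake) | set(latest)
    resolved = sum(1 for i in items
                   if actionable(intake.get(i, 0)) and not actionable(latest.get(i, 0)))
    newly_identified = sum(1 for i in items
                           if not actionable(intake.get(i, 0)) and actionable(latest.get(i, 0)))
    still_actionable = sum(1 for i in items
                           if actionable(intake.get(i, 0)) and actionable(latest.get(i, 0)))
    return {"resolved": resolved,
            "newly_identified": newly_identified,
            "still_actionable": still_actionable}

# Hypothetical intake and six-month ratings for one youth.
intake = {"Depression": 3, "School": 2, "Family": 1}
six_month = {"Depression": 1, "School": 2, "Family": 2}

print(outcome_summary(intake, six_month))
# -> {'resolved': 1, 'newly_identified': 1, 'still_actionable': 1}
```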

As demonstrated in the presentation, NRFS has successfully utilized these core constructs to guide its analytic and program evaluation efforts.

In closing, Max and I wish to thank those who attended the workshop we co-conducted with Stephen and Yvonne. Relationships and the learning process continue to mature through ongoing collaboration with attendees. We look forward to next year’s conference in Chicago!

Max Kintner, PhD

HCI Program Coordinator/Bridges to Health

Northeast Parent & Child Society
1580 Columbia Turnpike | Castleton, NY 12033
max.kintner@neparentchild.org

________________________________________

Eve Kintner, MA, MSW/PhD

Director of Performance Management

Northern Rivers Family of Services
60 Academy Road | Albany, NY 12208
evelyn.kintner@northernrivers.org

One Response

  1. This article really demonstrates some of the truths of successful implementation. While the reliability testing that is done yearly is the “sine qua non” of implementation, a locally developed (and monitored) evaluation plan gets the day-to-day organizational buy-in (and attention) that I have also found critical for successful implementation. I really liked your four principles for self-evaluation as well: well distributed across important components, yet focused enough to be done.
