The Revenue Impact of CDI and Coding Mismatches and Physician Query Analysis

By Karen G. Youmans, MPA, RHIA, CCS

Collaboration between clinical documentation integrity (CDI) specialists and health information management (HIM) coding professionals is key to a successful CDI program, as well as to evaluating its revenue impact on the healthcare facility. When a CDI specialist’s final MS-DRG (or APR-DRG) does not agree with the coder’s final MS-DRG (or APR-DRG), there is a “mismatch.” Many organizations struggle to find a balance between these two teams, but with collaboration, communication, and education, performing mismatch reviews on a concurrent, pre-bill basis can have a significant revenue impact.

Organizations need the appropriate data to accurately validate their CDI program and to account for their coders’ post-discharge pre-bill queries. This challenge goes hand in hand with determining the financial impact of the concurrent CDI program, bill holds for coders’ post-discharge pre-bill queries, and alignment with the healthcare facility’s goals. An effective analysis of the data trends and statistics surrounding CDI and coding mismatches, and of the effectiveness of the physician query process, is essential to demonstrating the positive effect on the organization’s documentation quality and finances.

To identify the revenue impact of a CDI and coding mismatch program, as well as of physician queries, key data must be analyzed, including trends in MS-DRGs, principal diagnoses (PDx), complications and comorbidities (CCs), major complications and comorbidities (MCCs), and ICD-10-PCS code assignments. In addition, coders’ post-discharge pre-bill queries, physician query compliance, and turnaround time for answered queries should be analyzed. These data elements provide deeper insight, supporting an accurate validation of a CDI and coding mismatch program and its impact on the bottom line, as well as the analysis of the physician queries themselves.

Examples of key findings from CDI and coding mismatch reviews, as well as physician query outcomes, are provided below. Each example includes questions that will help decision makers find insights for continuous improvement and program validation.

Example #1: CDI vs. Coder DRG Mismatch Summary Report

When the CDI specialist’s and the coder’s final DRGs do not match, the account is placed on hold for an auditor (internal or external) to make the final DRG determination.

CDI vs. Coder DRG Mismatch

Q1 2020 Summary Report

 
| Reviewer Findings | Total Cases | % of Total | DRG $ (estimated) |
|---|---|---|---|
| Agreed with Coder DRG | 3,800 | 74.51% | $7,600,000 |
| Agreed with CDI DRG | 800 | 15.69% | $2,400,000 |
| Neither – different DRG | 500 | 9.80% | $1,250,000 |
| Total | 5,100 | 100.00% | $11,250,000 |

In analyzing the above CDI vs. coder DRG mismatch summary report, the organization should:

  • Examine the reasons why the reviewer agreed with coding/disagreed with CDI. Is there additional ICD-10-CM/PCS coding training or official coding guideline education needed for the CDI team?
  • Evaluate the specific MS-DRGs where the reviewer agreed with CDI and/or did not agree with either (neither category). Is there an opportunity for additional clinical training for the coders? Or additional queries required? Or re-education on official coding guidelines?
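The roll-up behind a summary report like the one above is simple to reproduce. The following is an illustrative sketch only; the case records, category labels, and dollar amounts are hypothetical, not the facility data shown in the table:

```python
from collections import Counter

def mismatch_summary(cases):
    """Roll case-level reviewer findings up into a mismatch summary.

    cases: list of (finding, estimated_drg_dollars), where finding is
    'coder' (agreed with coder DRG), 'cdi' (agreed with CDI DRG), or
    'neither' (reviewer assigned a different DRG).
    """
    counts = Counter(finding for finding, _ in cases)
    dollars = Counter()
    for finding, amount in cases:
        dollars[finding] += amount
    total = sum(counts.values())
    return {
        finding: {
            "cases": counts[finding],
            "pct_of_total": round(100.0 * counts[finding] / total, 2),
            "drg_dollars": dollars[finding],
        }
        for finding in counts
    }

# Hypothetical mini-dataset: five reviewed accounts.
cases = [("coder", 2000), ("coder", 2000), ("coder", 2000),
         ("cdi", 3000), ("neither", 2500)]
summary = mismatch_summary(cases)
```

The same structure scales to a quarter's worth of accounts pulled from an audit worklist.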
Example #2: CDI Query Outcomes Summary
CDI Queries MS-DRG Changes Q1 2020 
| CDI Query Outcomes | # Queries | % of Queries | DRG $ (estimated gain) |
|---|---|---|---|
| PDx Changed | 598 | 32.08% | $1,794,000 |
| CC Added | 187 | 10.03% | $504,900 |
| MCC Added | 158 | 8.48% | $489,800 |
| PCS Changed | 33 | 1.77% | $95,700 |
| No change | 888 | 47.64% | $0 |
| Total | 1,864 | 100.00% | $2,884,400 |

In analyzing this CDI query outcomes summary report, the organization should:

  • Assess the reasons why there was no change in the MS-DRG 47.64 percent of the time. There can be many reasons for no financial impact, including queries for patient safety indicators (PSIs), hospital-acquired conditions (HACs), or severity of illness/risk of mortality (SOI/ROM). Or is additional clinical training or coding guideline education needed for the CDI team regarding writing an appropriate query? Were there specific MS-DRGs in this category that need further analysis? Was a specialty group more involved in the queries (e.g., cardiology, neurology, internal medicine, general surgery)? Was a specific physician more involved in the queries (e.g., not answering queries in a timely manner)?
  • Identify specific principal diagnosis changes. Were there any specific MS-DRGs in this grouping category that warrant a deeper dive? Is documentation training needed for specific specialty groups (e.g., hospitalists, cardiology, nephrology, internal medicine)?
  • Examine the specific CCs/MCCs added for opportunities for additional clinical training for the coders or additional documentation education for physicians.
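The query-outcome roll-up is a similar tally, with the "No change" bucket kept in view so the no-impact rate can be monitored over time. Again a hypothetical sketch, with illustrative outcome labels and amounts:

```python
from collections import defaultdict

def query_outcome_summary(queries):
    """queries: list of (outcome, estimated_dollar_gain), where outcome
    is e.g. 'PDx Changed', 'CC Added', 'MCC Added', 'PCS Changed', or
    'No change'. Returns outcome -> (count, pct_of_queries, gain)."""
    counts, gains = defaultdict(int), defaultdict(float)
    for outcome, gain in queries:
        counts[outcome] += 1
        gains[outcome] += gain
    total = sum(counts.values())
    return {
        outcome: (counts[outcome],
                  round(100.0 * counts[outcome] / total, 2),
                  gains[outcome])
        for outcome in counts
    }

# Hypothetical mini-dataset: four answered queries.
outcomes = query_outcome_summary([
    ("PDx Changed", 3000.0), ("CC Added", 2700.0),
    ("No change", 0.0), ("No change", 0.0),
])
```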
Example #3: MS-DRG Query Distribution – Original MS-DRG
Date Range: 10/01/18 – 09/30/19
| Original DRG | Total Queries | Total Impact | Average Impact |
|---|---|---|---|
| 690 | 147 | $284,666.65 | $1,936.51 |
| 871 | 129 | $97,344.01 | $754.60 |
| 193 | 108 | $263,273.45 | $2,437.72 |
| 689 | 106 | $462,376.91 | $4,362.05 |
| 603 | 98 | $269,014.43 | $2,745.05 |
| 190 | 98 | $143,982.52 | $1,469.21 |
| 392 | 89 | $160,315.17 | $1,801.29 |
| 872 | 86 | $312,900.75 | $3,638.38 |
| 194 | 77 | $291,647.73 | $3,787.63 |
| 189 | 70 | $173,337.29 | $2,476.25 |
| 292 | 53 | $137,784.89 | $2,599.71 |
| 191 | 51 | $76,905.98 | $1,507.96 |
| 682 | 48 | $47,529.09 | $990.19 |
| 291 | 48 | $123,880.42 | $2,580.84 |
| 192 | 47 | $139,707.27 | $2,972.50 |
| 378 | 47 | $159,825.63 | $3,400.55 |
| 683 | 45 | $98,068.01 | $2,179.29 |
| 379 | 45 | $102,333.44 | $2,274.08 |
| 313 | 41 | $106,410.95 | $2,595.39 |
| 948 | 40 | $49,365.33 | $1,234.13 |

In analyzing this original MS-DRG query distribution summary report, the organization should:

  • Analyze the top original MS-DRGs. Is there additional clinical training or coding guideline education needed for the CDI or coding teams regarding specialties? Was there a specialty group that was more involved in the queries (e.g., urology, internal medicine, hospitalists, etc.)? Is there a need for a more focused study on a clinical topic (e.g., sepsis)?
  • Identify specific MS-DRG triplets/pairs for further analysis (e.g., 291/292, 689/690).
  • Examine the specific CCs/MCCs added for opportunities for additional clinical training for the coders or additional documentation education for physicians.
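A distribution like the one above is a group-by on the original DRG. The sketch below uses hypothetical query records; the sorting by volume surfaces the top original MS-DRGs called out in the first bullet:

```python
from collections import defaultdict

def drg_query_distribution(queries):
    """queries: list of (original_drg, dollar_impact) per answered query.
    Returns rows (drg, total_queries, total_impact, average_impact),
    sorted by query volume, highest first."""
    count, total = defaultdict(int), defaultdict(float)
    for drg, impact in queries:
        count[drg] += 1
        total[drg] += impact
    rows = [(drg, count[drg], round(total[drg], 2),
             round(total[drg] / count[drg], 2)) for drg in count]
    return sorted(rows, key=lambda row: row[1], reverse=True)

# Hypothetical mini-dataset: three queries across two original DRGs.
rows = drg_query_distribution([("690", 1000.0), ("690", 2000.0),
                               ("871", 500.0)])
```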
Example #4: MS-DRG Query Distribution – Revised MS-DRG
Date Range: 10/01/18 – 09/30/19
| Revised DRG | Total Queries | Total Impact | Average Impact |
|---|---|---|---|
| 871 | 449 | $1,178,830.96 | $2,625.46 |
| 872 | 208 | $111,412.30 | $535.64 |
| 698 | 116 | $190,472.30 | $1,642.00 |
| 291 | 113 | $280,999.21 | $2,486.72 |
| 853 | 109 | $1,411,966.59 | $12,953.82 |
| 189 | 76 | $87,314.51 | $1,148.88 |
| 699 | 71 | $102,258.34 | $1,440.26 |
| 177 | 69 | $233,081.30 | $3,377.99 |
| 193 | 62 | $49,833.21 | $803.76 |
| 280 | 57 | $153,328.68 | $2,689.98 |
| 194 | 50 | $9,828.73 | $196.57 |
| 854 | 47 | $175,227.36 | $3,728.24 |
| 190 | 46 | $62,180.20 | $1,351.74 |
| 378 | 42 | $65,448.44 | $1,558.30 |
| 392 | 39 | -$19,885.57 | -$509.89 |
| 191 | 38 | -$10,740.03 | -$282.63 |
| 314 | 37 | $81,000.69 | $2,189.21 |
| 683 | 36 | -$22,024.35 | -$611.79 |
| 917 | 34 | -$48,360.72 | -$1,422.37 |
| 682 | 34 | $101,379.09 | $2,981.74 |

In analyzing this revised MS-DRG query distribution summary report, the organization should:

  • Analyze the top revised finalized MS-DRGs after query. Is there a need for a more focused study on a clinical topic (e.g., sepsis)? Is there an added opportunity for additional clinical training for the CDI or coding teams, or additional documentation education for physicians?
  • Identify specific MS-DRG triplets/pairs for further analysis (e.g., 871/872, 682/683).
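A report like this can also be screened programmatically for the two follow-ups suggested above: revised DRGs whose net query impact is negative (as with 392, 191, 683, and 917), and DRG families (pairs/triplets) to review together. A hypothetical sketch; the DRG values and family groupings below are illustrative:

```python
def screen_revised_drgs(totals, families):
    """totals: dict of revised_drg -> total dollar impact.
    families: list of DRG pair/triplet tuples to report together
    (e.g., ('871', '872') for the sepsis pair).
    Returns (DRGs with negative net impact, family -> combined impact)."""
    negative = sorted(drg for drg, amount in totals.items() if amount < 0)
    family_totals = {
        "/".join(family): round(sum(totals.get(drg, 0.0) for drg in family), 2)
        for family in families
    }
    return negative, family_totals

# Hypothetical mini-dataset mirroring the report's structure.
negative, pairs = screen_revised_drgs(
    {"871": 1000.0, "872": 250.0, "392": -100.0, "191": -50.0},
    [("871", "872"), ("391", "392")],
)
```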
Example #5: Post-Discharge Pre-Bill Query Summary
CODER QUERY RESULTS
First Quarter 2020
| Facility | Total Queries | Agreed | % Agreed* | Dollars Gained | Canceled or Not Answered | % Canceled or Not Answered | Potential Dollars Lost | Average Days to Respond |
|---|---|---|---|---|---|---|---|---|
| A | 40 | 39 | 97.50% | $32,600 | 1 | 2.50% | $600 | 4.08 |
| B | 291 | 261 | 89.69% | $223,500 | 30 | 10.31% | $6,903 | 4.93 |
| C | 62 | 52 | 83.87% | $108,471 | 10 | 16.13% | $2,297 | 5.08 |
| D | 472 | 399 | 84.53% | $404,726 | 73 | 15.47% | $124,341 | 6.66 |
| E | 102 | 83 | 81.37% | $65,894 | 19 | 18.63% | $11,224 | 7.26 |
| F | 398 | 374 | 93.97% | $412,662 | 24 | 6.03% | $26,896 | 4.30 |
| G | 183 | 179 | 97.81% | $152,235 | 4 | 2.19% | $5,320 | 3.76 |
| H | 441 | 421 | 95.46% | $508,316 | 20 | 4.54% | $8,073 | 3.21 |
| I | 113 | 109 | 96.46% | $195,260 | 4 | 3.54% | $1,400 | 3.61 |
| J | 167 | 161 | 96.41% | $176,244 | 6 | 3.59% | $2,100 | 2.99 |
| K | 26 | 22 | 84.62% | $19,432 | 4 | 15.38% | $1,349 | 3.73 |
| L | 242 | 211 | 87.19% | $223,066 | 31 | 12.81% | $23,262 | 4.67 |
| Total | 2,537 | 2,311 | 91.09% | $2,522,404 | 226 | 8.91% | $213,766 | 4.63 |

*Agreed = physician answered and $ changed, or physician answered but no $ change (i.e., POA or undetermined)

In analyzing this summary report of coding professionals’ post-discharge pre-bill queries, the organization should:

  • Examine the reasons for the volume of the coder’s post-discharge pre-bill queries. Was the coder providing additional queries that the CDI team did not originally present during the patient’s hospitalization, including items such as additional documentation post-discharge (e.g., discharge summary, pathology reports, etc.)? Or was the coder carrying on the initial CDI query post-discharge?
  • Drill down to specific hospitals and to the specific physicians who answer queries appropriately versus those whose queries go unanswered. Is additional clinical training needed for the coding team regarding writing an appropriate query? Is there a specialty group or are there individual physicians that need coaching from a physician advisor? Should a process improvement plan be implemented to address the average days to respond per facility or per specialty group?
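The per-facility roll-up in this report can be sketched the same way. This is a hypothetical example (facility names, dollar amounts, and field layout are illustrative); "agreed" follows the report's footnote, counting any answered query whether or not the answer changed reimbursement:

```python
from collections import defaultdict

def facility_query_results(queries):
    """queries: list of (facility, agreed, dollars, days_to_respond).
    'dollars' is the estimated gain when agreed, otherwise the
    potential dollars lost on a canceled/unanswered query."""
    raw = defaultdict(list)
    for facility, agreed, dollars, days in queries:
        raw[facility].append((agreed, dollars, days))
    results = {}
    for facility, rows in raw.items():
        total = len(rows)
        agreed_count = sum(1 for a, _, _ in rows if a)
        results[facility] = {
            "total_queries": total,
            "agreed": agreed_count,
            "pct_agreed": round(100.0 * agreed_count / total, 2),
            "dollars_gained": sum(d for a, d, _ in rows if a),
            "potential_dollars_lost": sum(d for a, d, _ in rows if not a),
            "avg_days_to_respond": round(sum(t for _, _, t in rows) / total, 2),
        }
    return results

# Hypothetical mini-dataset: four coder queries at one facility.
results = facility_query_results([
    ("A", True, 800.0, 3.0), ("A", True, 1200.0, 5.0),
    ("A", True, 0.0, 2.0), ("A", False, 600.0, 10.0),
])
```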
Example #6: Query Response Time
Physician Response Time to Queries (days)
| Facility | 2018 | 2019 | Change (%) |
|---|---|---|---|
| 120 Bed (Hospital) | 14.94 | 5.86 | 9.08 (61%) |
| 465 Bed (Hospital) | 4.85 | 4.13 | 0.72 (15%) |
| 2460 Bed (Healthcare System) | 6.57 | 5.04 | 1.53 (23%) |

In analyzing this physician response time to queries summary report, the organization should:

  • Examine the reasons for the 61 percent change in physician response time in the 120-bed hospital. Was this due to a positive change in processes? Was additional training provided to all coding professionals, CDI staff, and/or physicians/physician advisors? Was an electronic system implemented?
  • Analyze the impact of the dollar bill hold for each day the physician does not respond to the concurrent or pre-bill query.
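Both bullets above reduce to simple arithmetic. A hypothetical sketch follows; the bill-hold formula (account balance multiplied by days held, summed over accounts) is an illustrative assumption, not a prescribed method:

```python
def response_time_change(days_prior_year, days_current_year):
    """Absolute and percent improvement in average response time,
    as in the table above (e.g., 14.94 days down to 5.86 days)."""
    delta = days_prior_year - days_current_year
    return round(delta, 2), round(100.0 * delta / days_prior_year)

def bill_hold_impact(account_dollars, days_unanswered):
    """Estimated dollars held from billing while pre-bill queries sit
    unanswered: balance x days held, summed over accounts.
    Takes parallel per-account lists."""
    return sum(d * t for d, t in zip(account_dollars, days_unanswered))
```

Applied to the 120-bed hospital above, `response_time_change(14.94, 5.86)` reproduces the 9.08-day (61 percent) improvement shown in the table.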

These examples, statistics, and trends can be used to identify missed opportunities to enhance a CDI and coding DRG mismatch program, prove the positive effect on your organization’s quality documentation, create process improvement plans for physician queries, and measure financial impact.


Karen Youmans ([email protected]) is president at YES HIM Consulting, Inc.

Syndicated from https://journal.ahima.org/the-revenue-impact-of-cdi-and-coding-mismatches-and-physician-query-analysis/
