
Blog: Four Warning Signs of a Dusty Shelf Report

Posted on November 11, 2020 in Blog

Data Visualization Designer, Depict Data Studio, LLC

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.


When was the last time your report influenced real-life decision-making?

I used to write lengthy reports filled with statistical jargon. Important information sat around and gathered dust.

Now I design reports people actually want to read. Fewer paragraphs. More visuals. My audience can understand the information, so the data actually gets used.

What Reports Are Supposed to Do Every Single Time

Maybe a policy maker voted differently after reading your report…

Maybe your board of directors changed your programming after reading your report…

Maybe your supervisor adjusted your budget or staffing based on your findings…

Maybe your stakeholder group formed a task force to fix the issues brought up by your report…

We’ve all had successes here and there.

But does this happen every single time?

Four Red Flags to Watch For

A dusty shelf report is a report that people refuse to read. Or they glance at it once, don’t read it all the way through, and then repurpose it as a dust collector. Here are four signs that you’ve got a dusty shelf report on your hands. (Or a dusty dashboard, dusty infographic, or dusty slideshow. Watch for these red flags with all dissemination formats.)

1. No Response

You email your report to the recipient. Or you post it on your website.

You don’t get any response. The silence is deafening.

2. A Promise to Follow Up Later

You email your report to the recipient. They respond!

But the response is, “Thanks. We received the report. We’ll follow up later if we have any questions.”

This is not use! This is not engagement! We can do better.

3. “Compliments”

You email your report to the recipient. They respond!

But the response is, “Thanks. We received the report. We’ll follow up later if we have any questions. I can tell that a really technical team worked on this report.”

Yikes… that “compliment” is a red flag.

I used to hear this a lot. I thought, “Wow, they must’ve checked out our LinkedIn profiles! They can tell that our entire team has master’s degrees and Ph.D.s! They know we speak at national conferences!”

Later, I realized the reader was (kindly) mentioning our statistical jargon.

Watch for this one. It’s a red flag in disguise.

4. Won’t Read It

The recipients flat-out say, “We’re not going to read this.”

Sometimes, this red flag is expressed as a request for another format:

“Do you happen to have an infographic?” Red flag.

“Do you happen to have a slideshow?” Red flag.

I’ve seen this with several government agencies over the past couple of years. They explicitly require a two-pager in addition to the technical report. They recognize that the traditional format doesn’t meet their needs.

How to Transform Your Reports

If you’ve experienced any of the red flags, you’ve got a dusty shelf report on your hands.

But there’s hope! Dusty shelf reports aren’t inevitable.

Want to start transforming your reports? Try the 30-3-1 approach to reporting, use data storytelling, or get better at translating technical information for non-technical audiences. Our courses on these and other data visualization topics will help you soar beyond the dusty shelf report.

Blog: How Your Editor Is a Lot Like an Evaluator

Posted on January 22, 2020 by Cynthia Williams in Blog

Editor and Project Manager, Dragonfly Editorial

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

I’m Cynthia Williams, editor and project manager at Dragonfly Editorial and owner of Style Sheets Editorial Services. Having worked with lots of program evaluation and research consultant clients, I’ve seen how they help programs evaluate the quality of their offerings. In this blog, I’d like to show you how a good editor can act like an evaluator — for your publications.

We conduct needs assessments. Context is everything, and we want to make sure we’re sufficiently supporting your team. So if you tell us to focus on something specific in your reports — or not to mind, say, the capitalization of “Program Officer,” because that’s how the client likes it — we pay attention. Similarly, if the client wants a more muted tone to avoid too much bluster in the reporting of results, we’ll scan for that too. If your organization has a style guide, our editing is also informed by those requirements. By communicating with you about the right level of edit, we can avoid editing too lightly or too heavily.

We’re responsive to context. Further on context, we make sure to edit according to audience. If you’re reaching out to other experts in your field, we don’t query excessive jargon and terms of art — you’re talking to peers who know this stuff. But if you’re translating your research to lay readers (who may be educated but not versed in your area of expertise), we’ll add a comment if we come across phrasing or terms that make us, mostly editing generalists, do a double take. The thinking is, if we have to read that sentence more than once, so will the reader of your report.

Editors also bring industry standards to the table. Just as evaluators have the American Evaluation Association’s Guiding Principles for Evaluators, copy editors have an arsenal of guiding principles. We refer to style guides, such as The Chicago Manual of Style and the Publication Manual of the American Psychological Association. We employ usage manuals, such as Garner’s Modern English Usage and Merriam-Webster’s Dictionary of English Usage — and, of course, online dictionaries, encyclopedias, and grammar guides.

We use mixed methods. In addition to the above references, editors also use a more qualitative tool — that is, the editor’s ear. This practice is honed over years of reading enough similar materials to know industry norms and being versed in editing for readability and plain language.

Like evaluation, editing is a participatory process, a conversation between your organization and your eagle-eyed publication caretaker. The best results require open communication about each manuscript’s needs and audience, and flexibility from all parties to reach a high-quality final product.