A lesson on the bottom line

The AACSB peer review team left this morning after a whirlwind onsite accreditation visit to our business school. I shared a copy of our report with you a couple of months ago. Today I want to share a lesson on bottom line messages.

Some background: one area in which the College is reviewed is called Assurance of Learning (AOL), sometimes also referred to simply as assessment. Reporting on our AOL activities allows us to document that we know what our students learn in our degree programs and that we continuously improve those programs to enhance student learning. In our accreditation report, we followed a prescribed arrangement of materials, with AOL information contained within one section of the body and in summary tables within the appendices. Here’s an example of one of those tables for our undergraduate program (with around 6,500 students currently enrolled).

[AOL summary table]

There are four basic types of information in this table, identified with red circles for you (but not in the original document).

  1. The portion of the table devoted to Learning Goals describes the target knowledge or skills for students in a specific degree program. (There are five for our bachelor’s degree.)
  2. The entries for Assessment Tools describe how the learning goals are measured. (Two of these are writing assignments from required courses, which were collected from hundreds of students during spring of 2011 and scored by independent raters using a faculty-developed rubric.)
  3. The content in the Results section describes performance of students on the chosen measures. (For instance, although our students met the target of 80% meeting expectations for professional writing, there was a significant decrease in the percentage who exceeded expectations compared with a couple of years ago.)
  4. The bottom of the table is devoted to Resulting Improvement Initiatives in three categories: curricular changes, assessment procedure changes, and co-curricular changes. (One of the curricular changes recommended in this case was to hire more full-time faculty instead of relying on graduate students to teach professional writing.)

Back to my lesson on the bottom line.

Here’s what happened when our reviewers read our report: They recognized that we had appropriate learning goals and were collecting measures of that learning. However, they questioned whether we were actually using those measurements to drive improvement in our programs. (In the assessment world, this is called “closing the loop” and is obviously of critical importance.) Luckily, when our reviewers visited with our faculty and asked about continuous improvement, we were able to describe examples from lots of degree programs, including the bachelor’s degree.

So what went wrong? As the writer, I failed to understand what the bottom line message of these summary tables was for my readers. If I had recognized the improvement initiatives as the bottom line (rather than assuming the bottom line was the appearance of all four aspects of the process), I would have highlighted them. How? What is first (or biggest or loudest) has salience. (I’ve written about the psychology behind perception in information design before.) So I could have used a change in typeface, size, or color to make that section of the table more salient. Or I could have moved the information out of the table entirely and made it more prominent by discussing it first. And so on.

Here’s what I think is important about this lesson. It’s another reminder that the bottom line message is critical, and that it’s not the same for every reader. It also shows that soliciting comments from readers on drafts (ours saw two versions of the report before it was officially submitted) doesn’t guarantee success. Even though our readers praised the quality of our report, this bottom line message wasn’t clear to them. So perhaps the critical lesson is about keeping open the possibility of conversation to supplement any written documentation even in, or especially in, a bureaucracy.

  1. Hi, Kim…

    This post really resonated with me. I do a lot of proposals for clients hoping to do business with various levels of government. Many times, the “bottom line” response is not clear because the RFP is not written clearly. Perhaps this was true in your case. Had the instructions for the AOL report been clearer, that is, had they told you that you needed to address how measurement was used, you would have done so. It would be interesting to explore in a subsequent post how one goes about identifying the “bottom line” response.

    1. Hi, Debbie! You’re correct that it would be easier for the writer if the guidance for writing up the AOL material were revised. I suspect part of the issue here is that individual reviewers have different preferences for what counts as the bottom line.

      I have been thinking for years about research on bottom line identification. Now that accreditation activities will slow down, I might just do some thinking in this direction.

  2. So document cycling within the college didn’t work as well as you wanted. I wonder whether, if you had shown the document to a group of people in another college or even to outsiders (folks not in the academic world), they would have caught the need for a stronger BL about changes to programs and courses?

    1. This is an interesting thought. Part of me suspects non-academics, especially those who have been involved in continuous improvement in industry, would have helped me realize the need to highlight improvement efforts based on data collection.

      I should say that the reality is even worse than I communicated in this post. The document was actually reviewed TWICE by the chair of our official peer review team — one of the people who questioned whether we were closing the loop on assessment. In hindsight, I think I should have asked the other team members to review drafts of the document.
