Creating a content development system that brings in stakeholders

A different version of this material was presented at Confab Higher Ed in November 2015. See slides from the Confab presentation.

The problem

The small in-house team at The Evergreen State College wanted a process for aligning the content of the website with a new focus on prospective students as part of a redesign. How could we handle that work in sustainable phases while including subject matter experts, stakeholders, and the staff responsible for ongoing maintenance?

A unique opportunity

  • Phased-in redesign: the way we managed the visual redesign in our CMS allowed us to approach stakeholders with something they wanted (the new look) while requiring them to work with us to get it.
  • Renewed focus: our team had independently determined that the primary audience of the college website needed to be prospective students. As a group we presented to senior staff and received the approval to move forward on that assumption.
  • Team flexibility: I had been following content strategy trends online, and while our CMS was in stasis, the web manager allowed me to take the lead on this work.

An evolving process for content

Scoping the work and iterating over content

I began with a spreadsheet of all the site’s sections; together our team looked at our prior research and other college marketing to determine which sections were relevant to prospective students, then rated them on a 1-5 scale, where 1 was most important and 5 was least important. If a section was particularly important to current students, that raised its rating as well, since content for current students acts as “support research” for prospective students. For sections with little or no application to prospective or current students, we noted their core audience instead: staff, faculty, donors, alumni, legislators, etc.

This gave us an order in which to approach the stakeholders involved in each section, starting with the 1s and working down to the 5s. We did work on some sections that were not for prospective students if other college priorities were involved, or if we had a special opportunity that might not come again. We sometimes had to adjust our timeframes to make sure work got done before a particular high-traffic deadline, or to avoid working during a high-stress period for the section’s staff. It was a process of negotiation, mostly between the web manager and the manager of that area.
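
To make the triage concrete, here is a minimal sketch of the rating-and-sorting logic in Python; the section names, ratings, and audiences are invented for illustration, and the real work happened in a shared spreadsheet rather than in code.

    # A minimal sketch of our section triage, with invented data.
    # Ratings run 1 (most important to prospective students) to 5 (least).
    sections = [
        # (section, rating, core audience if not prospective students)
        ("Admissions", 1, None),
        ("Financial Aid", 1, None),
        ("Registration", 2, None),  # value to current students raised this rating
        ("Alumni Giving", 5, "donors and alumni"),
    ]

    # Our work queue: start with the 1s and work down toward the 5s.
    for name, rating, audience in sorted(sections, key=lambda s: s[1]):
        note = f" (core audience: {audience})" if audience else ""
        print(f"{rating}: {name}{note}")

Sorting by rating produced the default queue; the negotiations described above then adjusted timing within it.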

We tried to always include a good mix of staff in the work group: someone with approval authority, knowledgeable stakeholders, and the section’s primary editor if possible. It helped to have people who knew the audience directly as well as people who were connected to the bigger picture.

Discovery and research

I led the workgroup through a discovery exercise where we discussed the audiences for the section’s content. I asked a lot of questions, especially when people were overly general or listed too many audiences, to get at the core people who needed this information. I also had them talk through the questions they actually got asked most often, sometimes probing to see if there were people they didn’t encounter directly who might need information. And I asked what they really wanted people to be able to do or understand after visiting the site. Throughout the process, I did “live” note-taking, to underscore that this was a fluid and open process. It built a lot of trust and allowed staff to clarify their thoughts and feel that they were being understood.

Meanwhile, the designer on our team was the person most familiar with our Google Analytics, and he would pull together reports of most-visited pages, entry points, search terms, etc. We combined that with an audit of pages to see what content was currently being used. This gave us more topics for discussion, and in some cases pointed to immediate gaps in knowledge.
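
As a rough sketch of the kind of summary those reports gave us, the following assumes a CSV export from Google Analytics; the file name and the “Page”, “Pageviews”, and “Entrances” columns are assumptions for illustration, not our actual export format.

    # Sketch: summarizing a hypothetical Google Analytics CSV export.
    # File name and column names are assumed for illustration.
    import pandas as pd

    report = pd.read_csv("analytics_export.csv")

    # The section's most-visited pages.
    top_pages = report.sort_values("Pageviews", ascending=False).head(20)

    # Where visitors entered the site (entry points).
    top_entries = report.sort_values("Entrances", ascending=False).head(20)

    print(top_pages[["Page", "Pageviews"]])
    print(top_entries[["Page", "Entrances"]])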

If it wasn’t already clear where the content needed development, we then organized usability testing with tasks based on the discovery process. We used the techniques in Steve Krug’s books; the designer managed recruiting of participants and I ran the actual sessions. We already had a monthly usability testing practice, so it was straightforward to work these into our schedule.

Between the analytics, the audit, and the usability testing, we were left with a pretty clear outline of what content needed to be created or updated, and how navigation needed to be improved to help people get through that content. Occasionally, we would also discover opportunities for improvement elsewhere in the site or in the site as a whole, and add these to our work as time allowed.

Uncovering and organizing the content

I took the topics of greatest interest and business need and used them as an outline for developing content. We had regular group meetings; the agenda reflected the content under review, so only those with knowledge of and interest in that specific topic needed to be present. I used the same note-taking style as in the initial discovery phase and asked extensive questions, with the goal of understanding each topic well enough to write about it clearly and conversationally.

I discovered that this process gave us more consistent language throughout the site, which had not been one of our original goals. Because I spent time encouraging people to explain processes as they would to a real person, and didn’t have them do any writing to start with, the instructions and information they gave flowed naturally. So the writing started out closer to a “plain talk” style, which was easier to keep through later revisions.

At the same time, as we talked through subjects, the designer and I could see the relationships and connections between them, which helped us to collaborate on section navigation and section main page design. 

As each piece of content was written, we put it into the new site design alongside the evolving navigation, and the team could provide edits and other feedback. At each meeting, we would assess whether the new content was sufficient to replace the existing section: if at any point it was better than what already existed, we would go live even if the work was not entirely complete. The goal was always to provide site visitors with the best possible content at every point as we continued working.

Assessment and iteration

As a project wrapped up, we turned to assessing the work. Sometimes, this took the form of formal metrics: more completions of a campus visit form, for example. Sometimes it meant looking at analytics to see if pages had more views. Often we did a follow-up usability study to see if the tasks we’d established originally could be completed using the new content and navigation. If we found the need for changes, we would continue to adjust even while moving to a new section. 
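
The metric-based checks themselves were simple arithmetic; a sketch with hypothetical before-and-after exports (the file names and the “Completions” column are assumed) might look like this:

    # Sketch: before/after comparison for one metric, with assumed
    # file names and an assumed "Completions" column.
    import pandas as pd

    before = pd.read_csv("visit_form_before.csv")  # period before launch
    after = pd.read_csv("visit_form_after.csv")    # equal-length period after

    before_total = before["Completions"].sum()
    after_total = after["Completions"].sum()

    change = (after_total - before_total) / before_total * 100
    print(f"Campus visit form completions: {before_total} -> {after_total} ({change:+.1f}%)")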

Less formal but still valuable was the regular feedback from frontline staff, who could let us know if those annoying questions were being answered by the site instead of by them. These reports sometimes came with really meaningful stories about how our content had helped a student or an applicant to be more successful.

With each project, we learned something about the process that we could apply to the next one: a new way of asking a question, an understanding of who to include, a piece of research that might apply to a different section’s content.

This process lasted for three to four years, during which we reviewed and updated [go count ‘em] sections. Our focus changed when we migrated to Drupal as our CMS in 2016/17, which was an incredible step up for us technically but much more demanding of my time in particular. While we briefly added a full-time content strategist, that position was lost to budget cuts within a year, and our team was never able to get back to this work consistently.

Even so, some of that content is still in use, and the process built relationships that outlasted the work itself, giving our team advocates for other web projects when we needed them.