cropped image of presentation room at DEL2018

Digital Presence and Public Scholarship: Empowering Graduate Students as Professionals

Yesterday my colleague Kristen Mapes and I presented some of our work around Digital Presence and Public Scholarship at #Del18. Our presentation, titled “Digital Presence and Public Scholarship: Empowering Graduate Students as Professionals,” gave an overview of our work in general, but focused more specifically on our efforts to integrate digital presence work with graduate students as they move into careers as professional scholars.

You can view Kristen’s post about the presentation, check out our presentation on Google Slides, or contribute comments in our open paper draft.

As a follow-up, I’d like to address a question asked during our session: how do we work with people who come to our workshops simply seeking to be hand-held through the technology, rather than doing the active skill development and learning that we expect?

I think a partial answer is that we are working to build a culture around this digital presence work in which participants can rely on each other as experts, rather than feeling they need to rely on us as “technical experts.” Our goal is to have participants engage with us as colleagues who are, while perhaps further along, on the same journey of crafting and curating a digital presence ourselves. This culture change is really at the heart of making this a long-term, sustainable initiative; it’s how we recruit other collaborators and co-facilitators, and how the work spreads through departments, units, and other organizations on campus. It is a shared culture of creating, maintaining, and sharing.

Methodologically, the roots of this for me are really in my background as a writing consultant where the goal of a writing consultant is not to write the paper for the client, but rather to coach them through some of the issues that are found in the paper. Or, perhaps it’s rooted in some of the background of peer coaching that often accompanies leadership training. When peer coaching, your job is not to solve the problem for your colleague but rather to provide a sounding board and opportunity for them to come to a conclusion or solution themselves. By practicing in this way we move toward the culture change we desire, allowing us to continuously innovate and move toward higher levels or more complex instances of digital presence.

AAN Panel on Learning Assessment

On Friday 2/16 I was part of a panel sponsored by the Academic Advancement Network on Learning Assessment. Fellow panelists Stephen Thomas, Justin Bruner, Becky Matz, and I were asked the following questions, with conversation among the panelists and the audience following.

The word “assessment” means different things to different people and roles. What does learning assessment mean to you?

For me, assessment is about understanding what the student has learned, or how they’ve moved the needle, so to speak, in advancing their thinking or understanding of an issue. In our work with credentialing, we really need to understand how the student is making connections among these kinds of seemingly disparate events and activities in order to show that the sum is greater than the parts.

How do you talk to people you work with about assessment? At the college or program level, how does that translate into helping faculty and colleagues understand what you’re trying to achieve or assess?

We really start by talking about goals and objectives and understanding what it is that we want to know. You can’t assess anything unless you know what you want to assess, and too often when we start thinking about assessment we haven’t considered what our initial goals were, so we have no real way to assess.

Stephen’s comment about the difference between classroom assessment and program assessment was a really good point, one that shows we often think about assessment only at the programmatic level, and mostly with a negative connotation. Michael Lockett pointed out as well that in post-secondary education we often conflate assessment and evaluation, and that we might think about it more the way K-12 and other areas do, where evaluation happens by combining numerous assessments.

How do you see technology impacting your assessment practices? What does that mean for you?

As always, technology should be used in support of the work we are doing, or as a lever to make things easier or more efficient. Sometimes it can make things more effective, but all too often the assumption is that technology makes things more effective without any real planning behind it.

For me, the technology is allowing me to better show students the goals and objectives we’ve set out for them ahead of time, so they are not wandering around in a “fog.” The achievement system we’ve been building with Brian Adams, Nate Lounds, and others has been a great way of showing how technology can help students understand where they are in a program so that they can best choose ways to participate.

How do you see assessment impacting teaching and learning practices in the classroom? Do you see an impact?

When we are better able to assess learning accurately, we are better able to adjust our teaching practices in ways that will ideally yield better learning. When we better understand what students are learning, we can scaffold more effectively, present more learning opportunities, and help students figure out how to support their own learning. This is really where I see an advantage for us if we’re thinking about these things early in a student’s career, so they aren’t spending the majority of their time here not really understanding why, what, and how they could be learning.

Mouse, keyboard and notebook

Faculty Working Group on Digital Presence and Public Scholarship

This fall I’m co-leading a faculty working group focused on digital presence and public scholarship. Our working group is a collaboration among the Colleges of Arts and Letters, Education, and Natural Science, and brings together about 20 faculty and academic staff from those areas. While the technical solutions provided through services such as Reclaim Hosting’s Domain of One’s Own and social media platforms such as Twitter and Medium are at the core of a lot of the work we are doing, they are not the focus. Instead, our focus is on thinking critically about one’s presence online and how it can amplify one’s scholarly work and engage others.

At the College of Arts and Letters, the goal of the larger initiative this working group is part of is to:

Promote and support efforts by faculty and graduate students in the College of Arts and Letters to critically think about, create, and maintain a robust digital presence that amplifies and enriches their scholarship and enables them to engage the broader public.

We got off to a great start last week with an event sponsored by the MSU Academic Advancement Network, titled “Cultivating an Online Scholarly Presence,” which featured Dean Christopher Long and several faculty members from around the University sharing their digital presence and discussing how they use it in their work. We then followed up with the first of our weekly co-working sessions, where we bring faculty participants together for an hour of conversation, questions, and work on their digital presence. We used the first session to focus on mapping out the pieces of their digital presence and identifying the core elements.

I heard some great questions/comments through our sessions and have shared a few below. These are great examples of the critical nature of the work we are doing.

  • How do I create workflows so things are easy to create and update?
  • Does working with website templates make it easier to achieve compliance with accessibility?
  • How often should one update content?
  • How would using Twitter help me increase the reach of my scholarship?
  • How might I leverage the brand of the institution?

I’m looking forward to what the coming weeks have in store!


tools in a toolbox

A Quick Look at Tools I Use for Web Accessibility Work

A couple weeks ago I had an opportunity to present at the Michigan Association for Disability and Higher Education (MI-AHEAD) conference with my colleagues Nate Evans, Phil Deaton and James Jackson. We spoke about the great work Michigan State University is doing to promote accessible content in electronic information technologies (EIT), in particular our strategies around bringing awareness of accessibility into people’s workflow.

During a side conversation, I pulled up a website I’ve been working on so they could weigh in on a question of color contrast. As we worked, I realized that outlining some of the free tools I use daily for assessing website accessibility might be useful to others as well.

There are four tools I use regularly in my workflow. Below I’ve listed each with a brief description of how and why I use it. I use the browser-based tools with Google Chrome, but many of them work with other browsers too. One last note: none of these tools is a complete solution in itself, and automated checks always need a human to confirm their findings, but they are great aids in your work.

The Tools

HTML CodeSniffer

HTML CodeSniffer is the tool I use most often, as it gives me a quick overview of how inaccessible a page likely is. It sits as a bookmarklet in my browser and, at the click of a mouse, will analyze any page I have open against Section 508 or WCAG 2.0 A, AA, and AAA. The tool returns known errors, warnings, and notices, complete with a description of each, links to the W3C website, and a snippet of the code it’s examining.


Tota11y

Tota11y is another bookmarklet-based tool that helps confirm things CodeSniffer has found and identify other issues. Particularly useful are its abilities to identify possible contrast issues, lay out the heading structure, and identify ARIA landmarks. It also has a feature called Screen Reader Wand that seems to be a reasonably good emulator of what a screen reader would see. Tota11y will report errors it finds as well, but lacks some of the detail CodeSniffer provides.
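As a rough illustration of the kind of check a heading-structure feature performs, here is a small Python sketch of my own (not code from tota11y, and the names are hypothetical): it collects a page’s h1–h6 outline and flags skipped heading levels, one of the most common structural issues these tools surface.

```python
from html.parser import HTMLParser

class HeadingOutline(HTMLParser):
    """Collect h1-h6 headings to expose a page's heading structure."""
    def __init__(self):
        super().__init__()
        self.headings = []   # list of (level, text) in document order
        self._level = None   # heading level currently being read, if any
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag in ('h1', 'h2', 'h3', 'h4', 'h5', 'h6'):
            self._level = int(tag[1])
            self._text = []

    def handle_data(self, data):
        if self._level is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if self._level is not None and tag == f'h{self._level}':
            self.headings.append((self._level, ''.join(self._text).strip()))
            self._level = None

def outline(html):
    """Return the heading outline plus a list of skipped-level warnings."""
    parser = HeadingOutline()
    parser.feed(html)
    issues, prev = [], 0
    for level, text in parser.headings:
        if level > prev + 1:  # e.g. an h3 directly after an h1
            issues.append(f'skipped level before h{level} "{text}"')
        prev = level
    return parser.headings, issues
```

For example, `outline('<h1>Title</h1><h3>Oops</h3>')` reports one skipped-level issue, since the h3 follows the h1 with no h2 in between.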

Web Developer Tool Bar

I have long used the Web Developer toolbar when developing websites, but have recently found it very useful for seeing what a page looks like when you strip out all formatting and styles. I usually turn off all styles, hide images and background images, and leave alt text on. This gives me a relatively good sense of what a page might look like to a screen reader or search engine.
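To make that stripped-down view concrete, here is a hypothetical Python sketch of the same idea (my own illustration, not what the toolbar or a real screen reader does): it linearizes a page into plain text, substitutes each image’s alt text, flags images with no alt attribute, and skips images marked decorative with an empty alt.

```python
from html.parser import HTMLParser

class TextWithAlt(HTMLParser):
    """Approximate a 'no styles, alt text on' view of a page."""
    SKIP = {'script', 'style'}  # content the reader never sees

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1
        elif tag == 'img':
            alt = dict(attrs).get('alt')
            if alt is None:
                self.parts.append('[image: MISSING ALT]')
            elif alt:  # alt="" marks a decorative image; skip it
                self.parts.append(f'[image: {alt}]')

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.parts.append(data.strip())

def linearized(html):
    """Return the page's visible text with alt-text placeholders."""
    parser = TextWithAlt()
    parser.feed(html)
    return ' '.join(parser.parts)
```

Running `linearized('<p>Hello</p><img src="a.png" alt="logo"><img src="b.png">')` yields `Hello [image: logo] [image: MISSING ALT]`, which makes missing alt text jump out immediately.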

Colour Contrast Analyser and WebAIM Color Contrast Checker

OK, this is two tools, but their uses complement one another. The first is the Colour Contrast Analyser from the Paciello Group and the second is WebAIM’s Color Contrast Checker. The first requires a download but is worth it for the ability to use droppers to select colors (making it easy to find the exact color you want). It will immediately tell you whether your colors pass at the AA and AAA levels and gives the hex code for each color. If my contrast is not passing, I take the hex code over to the WebAIM tool and use its lighten/darken feature to find the closest passing color to the one causing the failure. Many times this has allowed us to maintain the look imagined by the designer while also ensuring the site is accessible.
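Both tools implement the contrast math defined in WCAG 2.0, which you can also compute yourself. The sketch below (an illustration, not code from either tool) linearizes each sRGB channel, computes relative luminance, and takes the ratio (L1 + 0.05) / (L2 + 0.05) with the lighter color first; AA requires at least 4.5:1 for normal text.

```python
def _linear(c8):
    """Linearize one 8-bit sRGB channel per the WCAG 2.0 formula."""
    c = c8 / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_color):
    """Relative luminance of a color like '#767676'."""
    h = hex_color.lstrip('#')
    r, g, b = (_linear(int(h[i:i + 2], 16)) for i in (0, 2, 4))
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio: (L1 + 0.05) / (L2 + 0.05), lighter color first."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# AA requires 4.5:1 for normal text (3:1 for large text); AAA requires 7:1.
print(round(contrast_ratio('#767676', '#ffffff'), 2))  # ~4.54, just passing AA
```

A useful sanity check: black on white comes out to exactly 21:1, the maximum possible ratio.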

These tools are certainly not a comprehensive suite, and they do require a reasonable amount of background knowledge (as do all accessibility tools) to use for proper remediation work. However, along with a few other tools they form a solid base from which to work.

What tools and techniques do you find at the core of your accessibility work?

Course-based Learning Analytics at OLC Innovate 2016

We just received word this morning that our proposal titled “Investigating Course-Level Analytics in Online Classes” was accepted to the OLC Innovate 2016 conference.

The presentation builds on the work I am doing with colleagues Angelika Kraemer, Jessica Knott and Stephen Thomas here at MSU to explore how course level data can be used to promote high quality teaching and learning.

Some guiding questions we pose:

  1. How is your institution looking at learning analytics? What support is there for more focused, course-based work?
  2. Are there data points in your LMS or elsewhere which can deliver information that would inform teaching, but which were not necessarily intended to provide that information?
  3. How do you navigate issues of validity, statistical power, etc. when working with course-level data? How can you make decisions with relative certainty, even if they are not widely generalizable?
  4. What data and analysis tools do you need, and how might you use them yourself or with faculty?