Mouse, keyboard and notebook

Faculty Working Group on Digital Presence and Public Scholarship

This fall I’m co-leading a faculty working group focused on digital presence and public scholarship. The working group is a collaboration among the Colleges of Arts and Letters, Education, and Natural Science, and brings together about 20 faculty and academic staff from those areas. While technical solutions such as Reclaim Hosting’s Domain of One’s Own and social media services such as Twitter and Medium underpin much of the work we are doing, they are not the focus. Instead, our focus is on thinking critically about one’s presence online and how that presence can amplify one’s scholarly work and engage others.

At the College of Arts and Letters, this working group is part of a larger initiative whose goal is to:

Promote and support efforts by faculty and graduate students in the College of Arts and Letters to critically think about, create, and maintain a robust digital presence that amplifies and enriches their scholarship and enables them to engage the broader public.

We got off to a great start last week with an MSU Academic Advancement Network sponsored event led by Dean Christopher Long, titled “Cultivating an Online Scholarly Presence,” in which he and several faculty members from around the University shared their digital presence and discussed how they use it in their work. We then held the first of our weekly co-working sessions, which bring faculty participants together for an hour of conversation, questions, and work on their digital presence. We used this first session to map out the pieces of each participant’s digital presence and to identify the core elements.

I heard some great questions and comments during our sessions and have shared a few below; they illustrate the critical nature of the work we are doing.

  • How do I create workflows so that content is easy to create and update?
  • Does working with website templates make it easier to comply with accessibility standards?
  • How often should one update content?
  • How would using Twitter help me increase the reach of my scholarship?
  • How might I leverage the brand of the institution?

I’m looking forward to what the coming weeks have in store!

 

[Image: tools in a toolbox]

A Quick Look at Tools I Use for Web Accessibility Work

A couple of weeks ago I had an opportunity to present at the Michigan Association on Higher Education and Disability (MI-AHEAD) conference with my colleagues Nate Evans, Phil Deaton, and James Jackson. We spoke about the great work Michigan State University is doing to promote accessible content in electronic information technologies (EIT), in particular our strategies for bringing accessibility awareness into people’s workflows.

During a side conversation, I pulled up a website I’ve been working on so they could weigh in on a question of color contrast. As we worked, I realized that outlining some of the free tools I use daily for assessing website accessibility might be useful to others as well.

There are four tools I use regularly in my workflow. Below I’ve listed each with a brief description of how and why I use it. I use the browser-based tools with Google Chrome, but many of them work with other browsers too. One last note: none of these tools is a complete solution on its own, and automated checks always need a human to confirm their findings, but they are great aids in your work.

The Tools

HTML Code Sniffer

HTML Code Sniffer is the tool I use most often, as it gives me a quick overview of how inaccessible a page likely is. It sits as a bookmarklet in my browser, and at the click of a mouse it will analyze whatever page I have open against Section 508 or WCAG 2.0 A, AA, and AAA. The tool returns known errors, warnings, and notices, each complete with a description, links to the W3C website, and a snippet of the code in question.

Tota11y

Tota11y is another bookmarklet-based tool that helps to confirm things Code Sniffer has found and to identify other issues. Particularly useful is its ability to flag possible contrast issues, lay out the heading structure, and highlight ARIA landmarks. It also has a feature called the Screen Reader Wand that seems to be a reasonably good emulation of what a screen reader would encounter. Tota11y reports the errors it finds as well, but lacks some of the detail Code Sniffer provides.
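To give a rough sense of what a heading-outline or landmark check involves under the hood, here is a small sketch of my own (not Tota11y’s code) that logs a page’s heading structure and ARIA landmarks using standard DOM APIs. It’s written as TypeScript; pasting the equivalent compiled JavaScript into the browser console works too.

    // Log the page's heading outline and ARIA landmarks. Illustrative only;
    // this is not how tota11y is implemented, just the same kind of check.
    const headings = Array.from(
      document.querySelectorAll<HTMLHeadingElement>("h1, h2, h3, h4, h5, h6")
    );
    for (const h of headings) {
      const level = Number(h.tagName.substring(1)); // "H2" -> 2
      console.log("  ".repeat(level - 1) + h.tagName + ": " + (h.textContent ?? "").trim());
    }

    // Elements with explicit role attributes plus the implicit HTML5 landmarks.
    const landmarks = document.querySelectorAll<HTMLElement>(
      "[role], main, nav, header, footer, aside"
    );
    landmarks.forEach((el) => {
      console.log("landmark: " + (el.getAttribute("role") ?? el.tagName.toLowerCase()));
    });

A skipped heading level or a page with no landmarks at all shows up immediately in output like this, which is exactly the kind of issue I then go fix in the markup.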

Web Developer Toolbar

I have long used the Web Developer Toolbar when building websites, but I have recently found it very useful for seeing what a page looks like once all formatting and styles are stripped out. I usually use it to turn off all styles and to hide images and background images while leaving alt text on. This gives me a relatively good sense of how a page might appear to a screen reader or a search engine.
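When the toolbar isn’t handy, a similar view can be approximated from the browser console. The sketch below is my own rough equivalent, not a feature of the extension: it removes stylesheets and inline styles and swaps each image for its alt text, which also makes missing alt text obvious.

    // Rough approximation of "disable styles, hide images, keep alt text".
    // Run (as compiled JavaScript) in the console of the page you're reviewing.
    document.querySelectorAll('link[rel="stylesheet"], style').forEach((node) => node.remove());
    document.querySelectorAll<HTMLElement>("[style]").forEach((el) => el.removeAttribute("style"));
    document.querySelectorAll<HTMLImageElement>("img").forEach((img) => {
      // Replace each image with its alt text; missing alt text is called out.
      img.replaceWith(document.createTextNode("[image: " + (img.alt || "no alt text") + "]"));
    });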

Colour Contrast Analyser and WebAIM Color Contrast Checker

OK, this is two tools, but their uses complement one another. The first is the Colour Contrast Analyser from the Paciello Group and the second is WebAIM’s Color Contrast Checker. The first requires a download, but it is worth it for the eyedroppers it provides for selecting colors (it is easy to grab the exact color you want). It immediately tells you whether your colors pass at the AA and AAA levels and gives the hex code for each color. If my contrast is not passing, I take the hex codes over to the WebAIM tool and use its lighten/darken feature to find the closest passing color to the one causing the failure. Many times this has allowed us to maintain the look the designer imagined while also ensuring the site is accessible.
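For anyone curious about what these checkers compute, the sketch below implements the WCAG 2.0 relative-luminance and contrast-ratio formulas; the thresholds for normal-size text are 4.5:1 at AA and 7:1 at AAA. This is my own illustration rather than code from either tool, and the colors in the example are just sample values.

    // Compute the WCAG 2.0 contrast ratio between two hex colors.
    function relativeLuminance(hex: string): number {
      const n = parseInt(hex.replace("#", ""), 16);
      const [r, g, b] = [(n >> 16) & 0xff, (n >> 8) & 0xff, n & 0xff].map((c) => {
        const s = c / 255; // linearize the sRGB channel
        return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
      });
      return 0.2126 * r + 0.7152 * g + 0.0722 * b;
    }

    function contrastRatio(fg: string, bg: string): number {
      const l1 = relativeLuminance(fg);
      const l2 = relativeLuminance(bg);
      const [lighter, darker] = l1 > l2 ? [l1, l2] : [l2, l1];
      return (lighter + 0.05) / (darker + 0.05);
    }

    // Example: dark gray text on white is roughly 7:1, passing AA and AAA
    // for normal-size text.
    console.log(contrastRatio("#595959", "#ffffff").toFixed(2));

The lighten/darken workflow described above is essentially a search along this formula: nudge one color until the ratio crosses the 4.5:1 or 7:1 threshold while staying as close as possible to the designer’s original choice.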

These tools are certainly not a comprehensive suite, and they do require a reasonable amount of background knowledge (as do all accessibility tools) to use for proper remediation work. However, along with a few other tools they form a solid base from which to work.

What tools and techniques do you find at the core of your accessibility work?

Course-based Learning Analytics at OLC Innovate 2016

We just received word this morning that our proposal titled “Investigating Course-Level Analytics in Online Classes” was accepted to the OLC Innovate 2016 conference.

The presentation builds on the work I am doing with colleagues Angelika Kraemer, Jessica Knott, and Stephen Thomas here at MSU to explore how course-level data can be used to promote high-quality teaching and learning.

Some guiding questions we pose:

  1. How is your institution looking at learning analytics? What support is there for more focused, course-based work?
  2. Are there data points in your LMS or elsewhere that can deliver information to inform teaching, even though they were not necessarily intended to provide that information?
  3. How do you navigate issues of validity, statistical power, etc., when working with course-level data? How can you make decisions with relative certainty, even if they are not widely generalizable?
  4. What data and analysis tools do you need, and how might you use them yourself or with faculty?