
In January 2015, ATD offered the webinar "Helping Learners Learn in the Digital Era," during which participants asked a number of thought-provoking questions. In addition to inquiring about how they, as learning professionals, could increase their own digital literacy, they wanted to know how to:

  • make a strong business case and sell leadership on the importance of a digitally literate workforce
  • create institutional support for digital literacy initiatives
  • motivate individuals to become more digitally literate.

Underlying these queries were questions about assessing digital literacy, including: “Are there particular assessments to get a sense of skill set for a diverse organization—from production line to R&D engineers to sales and marketing?” and “Does everyone need the same skill set?”

In addition, one participant asked about ATD’s potential role in developing some kind of tool for assessing digital literacy.

Although it’s not practical to create a single measure to assess the digital literacy of workers at all levels and in all functional areas, it is possible to create a general measure to assess basic competencies. Here’s our take on developing a framework for creating this sort of measure.

Defining Digital Literacy

Before designing a way to measure digital literacy, it’s important to define it.

Both the American Library Association and Wikipedia provide solid working definitions of digital literacy. Here is Wikipedia’s definition: “A digitally literate person will possess a range of digital skills, knowledge of the basic principles of computing devices, skills in using computer networks, an ability to engage in online communities and social networks while adhering to behavioral protocols, be able to find, capture and evaluate information, an understanding of the societal issues raised by digital technologies (such as big data), and possess critical thinking skills.”

Although definitions like this are focused on digital literacy in a global sense, the core concepts can easily be applied to the context of work.

Assessing Digital Literacy—Four Components and Three Focal Areas

In general, workplace digital literacy comprises four hierarchical components. The first three focus on basic knowledge and understanding, as well as organizational and individual applications. The fourth component focuses on related skills and the ability to leverage digital technology effectively.

  • Digital era concepts. This component focuses primarily on concepts tied to job-related communication and collaboration, such as platforms, channels, content creation and curation, crowdsourcing, cloud computing, and cybersecurity.
  • Digital tools and systems. Digital tools include the obvious—email, chatting/instant messaging, the Microsoft Office suite of products (and equivalents), as well as tools like photo and video editors. Systems include software applications developed for specific purposes, such as accounting, business intelligence, and learning management.
  • Social technology features, platforms, and tools. A sampling of social technology features includes blogs, customized aggregators, dashboards and portals, discussion forums/threads, media sharing features, user-generated profiles, and wikis. Platforms and tools include obvious public networks like LinkedIn, Twitter, and YouTube, but also tools like Disqus, ShareThis, and privately oriented offerings such as Yammer, Jive, and Interact Intranet.
  • Digital engagement skills and tactics. This component focuses on the skills required to use social and digital technologies efficiently, as well as the necessary judgment to use them effectively. Examples include knowing the right channel to use for a given communication, using email productively, creating and engaging productively in discussion threads and forums, curating and validating content, contributing to a wiki, and understanding HTML basics.

In considering how to develop a general measure for assessing workplace digital literacy, there are three focal areas of primary concern for employers.

  • Communication and collaboration. The ability to communicate and collaborate with others using digital technology is critical in enabling an organization to function both efficiently and effectively. This includes internal communication with colleagues and external communication with clients and others.
  • Cybersecurity. Individuals have typically been found to be the weakest link in protecting an organization in cyberspace. Understanding the risks and engaging in the right behaviors create a strong first line of defense against hackers, viruses, and other digital threats.
  • The law and ethics. Workers now have added responsibilities with respect to things like protecting an organization’s brand, intellectual property, and trade secrets; maintaining proper levels of confidentiality; and ensuring the privacy of clients, fellow employees, and other stakeholders. These responsibilities apply not only to their digital engagements while on the job, but also can extend to their non-work activities.

Building a Measure to Assess Digital Literacy

Any tool for assessing digital literacy should include questions that address the relevant components in the context of each of the three focal areas. Here are some examples:

Communication and collaboration:

  • Digital era concepts: What is an enterprise social network? What is crowdsourcing?
  • Digital tools and systems: What is the main difference between texting and chatting/instant messaging?
  • Social technology features, platforms, and tools: What are the basic elements of a user profile?
  • Digital engagement skills and tactics: What is the best digital channel for a given conversation? What are the basic steps for adding a hyperlink to text or an image in a Microsoft Word, PowerPoint, or Excel file? Is it okay to express strong negative emotions via digital channels?

Cybersecurity:

  • Digital era concepts: What is phishing? What is hacking? How is malware different from a virus?
  • Digital tools and systems: What is the primary way an individual can expose an organization to malware or a virus?
  • Digital engagement skills and tactics: If you suspect your computer has been infected by a virus or malware, what’s the first thing you should do? What’s a good password to use on your mobile device? If you access work systems remotely, should you make sure you’re on a secure channel first?
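Questions like the password one above could also be turned into the kind of auto-graded skill exercise described later in this article. Here is a minimal Python sketch of such a grader; the specific strength rules (length plus character variety) are illustrative assumptions, not recommendations from the article.

```python
import re

def password_strength(password: str) -> str:
    """Rate a password as 'weak', 'fair', or 'strong' using simple,
    illustrative rules: length plus variety of character classes."""
    score = 0
    # Longer passwords earn more points.
    if len(password) >= 12:
        score += 2
    elif len(password) >= 8:
        score += 1
    # One point for each character class present.
    for pattern in (r"[a-z]", r"[A-Z]", r"[0-9]", r"[^A-Za-z0-9]"):
        if re.search(pattern, password):
            score += 1
    if score >= 5:
        return "strong"
    if score >= 3:
        return "fair"
    return "weak"
```

For example, `password_strength("password")` returns `"weak"`, while a long mixed-character phrase such as `"C0rrect-Horse!"` returns `"strong"`. A real assessment would tailor the rules to the organization's own password policy.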

The law and ethics:

  • Digital tools and systems: How can trade secrets be leaked via email?
  • Digital engagement skills and tactics: If you’re discussing the organization or one of its brands on a public social network (for example, LinkedIn), should you disclose your working relationship? Is it appropriate to discuss clients on your personal social networks (for example, Facebook), even if you don’t name names? If you suspect a confidentiality leak, what is the first step you should take to report the loss or compromised information?

Additional design considerations:

  • Questions can be true/false, multiple-choice, or open-ended.
  • The assessment can include specific skill exercises, such as sharing a link to a web page or starting a discussion thread.
  • To the extent possible, questions should be contextually independent, although different versions of the same question may need to be created to adapt to different browsers (Internet Explorer or Chrome) and operating environments (Windows or Mac). Some answers may need to be tweaked to reflect an organization’s unique policies, procedures, and practices.
  • The assessment should probably be weighted toward skills and tactics, including assessments of judgment, etiquette, and ethics. In other words, it should address the question: What’s the right thing to do in a particular situation?
  • To motivate people and make the test more palatable, the assessment could take the form of a digital scavenger hunt, with gamification elements to reward people for correct answers and accomplishments.
  • Questions should have clear right/wrong answers to enable the creation of strong cutoffs between different competency levels, as well as to establish baseline measures and standards.
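The weighting and cutoff ideas above can be sketched in code. In this minimal Python sketch, the component names, weights, and cutoff scores are illustrative assumptions, not values prescribed by the framework:

```python
# Illustrative weights per component, with skills and tactics
# weighted highest, as suggested above. Weights sum to 1.0.
WEIGHTS = {
    "concepts": 0.2,
    "tools_systems": 0.2,
    "social_tech": 0.2,
    "skills_tactics": 0.4,
}

# Illustrative cutoffs mapping an overall score (0-100) to a
# competency level, checked from highest to lowest.
CUTOFFS = [(85, "advanced"), (70, "proficient"), (50, "basic")]

def competency_level(component_scores: dict) -> str:
    """Combine per-component scores (each 0-100) into a weighted
    overall score, then map it to a competency level."""
    overall = sum(WEIGHTS[c] * component_scores[c] for c in WEIGHTS)
    for cutoff, level in CUTOFFS:
        if overall >= cutoff:
            return level
    return "novice"
```

For example, scores of 80, 70, 60, and 90 on the four components give a weighted overall score of 78, which falls in the "proficient" band under these cutoffs. An organization would set its own weights and cutoffs based on baseline testing.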

Your Thoughts?

Creating a tool for assessing digital literacy is certainly possible, but it will not be easy. There are plenty of nuances associated with the measurement components, and many details to address. You will also need to test the assessment to ensure the tool is both valid and reliable.

As you start to develop your assessment, be sure to ask:

  • Does this first cut at creating a general digital literacy assessment make sense to you?
  • What other components should be included?
  • What other focal areas should be considered? 
  • How would you tweak the design?

We welcome your feedback and ideas.