Digital Literacy Project
- Sherry Salek
- Jan 18, 2023
- 11 min read
This is a summary of a project I recently completed, with the analysis performed in the SPSS statistical package.
PROJECT OBJECTIVES
Digital Literacy – The ability to use information and communications technology (ICT) or digital tools, commonly referred to as computer literacy or digital skills, is a component of workplace skills.
This project will implement digital literacy training for all employees and conduct research to gain an understanding of the barriers employees face related to information and communication technology.
Training Evaluation Method
Kirkpatrick Model
Measuring the effectiveness of a training program is an important but complicated task, since it must consider both the participants’ and the organization’s satisfaction. We utilized the Kirkpatrick Evaluation Model to understand what is working (and what is not) in the training program, and to design and develop programs that meet the needs of the participants and the business. The model has four levels of evaluation for measuring training effectiveness and improving our instructional design for future initiatives: Reaction, Learning, Behavior, and Results.

Level 1: Reaction
Participants’ reaction to the training is the first level of evaluation. It measures how the participants in the training program feel about their experience. We created a survey to capture the participants’ overall experience and used the feedback to identify areas for improvement and possible changes for future iterations of the training program.
KPIs:
- Employees’ feedback:
On their satisfaction level with training topics,
On technical issues,
On the quality of the presentation,
On the amount of information,
On the length of training sessions,
On the size of training group,
On the level of difficulty with training topics,
On whether their job role is relevant to the training topics,
On their desire to attend future training sessions and their desired training content.
- The training completion rate: some intermediate-group participants were transferred to the beginner group in the middle of the training.

Level 2: Learning
We measured how much of the information employees retained. We used self-assessment questions to measure how difficult the participants found the training topics, their understanding of the topics before the training, the difficulty of the topics during the training, and their confidence after the training.
KPIs:
- Information retention: follow-up assessment, input from managers and supervisors

Level 3: Behavior
We determined whether participants are progressing by applying their new skills on the job. This takes time, because it can take weeks or months for employees to build confidence and apply their knowledge. The beginner participants’ supervisors were contacted two months after the training sessions to collect feedback on the participants’ performance.
KPIs:
- Learner confidence: We estimated how confident the participants are in their newfound knowledge and abilities, asking their supervisors for their opinions. The goal is to see whether there is a change in the participants’ behavior.

Level 4: Results
Finally, we check the overall results and how they affect the digital literacy project objectives.
KPIs:
- Individual efficiency increases: working with supervisors, we compare how productive each employee is with digital tools now versus before their training.
- Average efficiency increase: how much more productive on average the employees have become after training.
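As a sketch of how these two KPIs could be computed, the snippet below derives individual and average efficiency increases from before/after productivity scores. The employee names and scores are hypothetical, purely for illustration; the project's actual analysis was done in SPSS.

```python
# Hypothetical before/after productivity scores (e.g., digital tasks completed
# per week). These numbers are illustrative only, not the project's real data.
scores = {
    "employee_a": {"before": 10, "after": 13},
    "employee_b": {"before": 8, "after": 10},
    "employee_c": {"before": 12, "after": 12},
}

def efficiency_increase(before, after):
    """Relative productivity change for one employee."""
    return (after - before) / before

# Individual efficiency increase (Level 4 KPI, per employee).
individual = {name: efficiency_increase(s["before"], s["after"])
              for name, s in scores.items()}

# Average efficiency increase across all trained employees.
average = sum(individual.values()) / len(individual)
print(f"Average efficiency increase: {average:.1%}")
```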
Training Purpose for the Focus Group
A survey was distributed to all employees with four hundred twenty-two (422) responses. After reviewing the survey results and asking the supervisors’ opinions, 24 individuals were designated as The Focus Group Participants.
Focus groups are great tools for exploring what participants think and feel about training, and to get suggestions for future improvements.
The training developed and delivered in this project was designed to provide basic digital skills to employees with little or no digital skills and/or lower literacy levels, enabling them to use information and communication technologies (ICTs) to carry out their job-related tasks more efficiently and effectively. The desired outcomes for the participants in this program were to:
Understand basic computer terminology
Perform the basic functions of using a computer tablet
Know web browser features and how to use them with the Windows operating system
Improve their computer literacy in their current jobs
Meet a personal goal
Access other learning opportunities
All Focus Group participants completed a ten-week live virtual training.
Methodology for The Focus Group
A survey was created online through Microsoft Forms. Fielding was conducted between October 4 and October 14, 2022. We notified the supervisors about the survey, and instructions were sent to the participants.
We collected qualitative and quantitative responses. Pie charts were created to help communicate the survey results.
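As a minimal sketch of how the data behind such a pie chart can be prepared, the snippet below tallies responses into rounded percentage shares. The response labels and counts are assumed for illustration, not taken verbatim from the actual survey.

```python
from collections import Counter

# Illustrative answers to a satisfaction question; labels are assumptions,
# not the exact wording of the project's survey instrument.
responses = [
    "Very satisfied", "Somewhat satisfied", "Very satisfied",
    "Neither", "Somewhat dissatisfied", "Very satisfied",
]

counts = Counter(responses)
total = sum(counts.values())

# Percentage share per label, rounded to whole numbers for chart labels.
shares = {label: round(100 * n / total) for label, n in counts.items()}
for label, pct in shares.items():
    print(f"{label}: {pct}%")
```

A plotting library (e.g. matplotlib's `pyplot.pie`) can then render `counts.values()` with `counts.keys()` as labels.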
Survey Purposes for The Focus Group
The purposes of the survey were to:
Identify the training satisfaction level,
Identify the technical issues of the depots during the training period,
Identify the quality of the presentation,
Identify the satisfaction level for the amount of information,
Identify the satisfaction level for the length of training sessions,
Identify the satisfaction level for the size of training group,
Identify the level of difficulty with training topics,
Identify whether the participants’ job role is relevant to the training topics,
Identify the participants’ desire to attend the future training sessions and their desired training content,
Identify areas of strength and opportunities for improvement.
Data Collection Technique
Data collection is instrumental in understanding how each participant’s knowledge, skills, and abilities can be improved to meet organizational results. We collected the required data and information through three methods:
Surveys: To conduct a survey, we prepared a questionnaire and circulated it among the participants. The survey responses are mapped to numeric scores and used for objective analysis.
Observations: Observation is a subjective data-collection method. The trainer observed the participants’ engagement in the training sessions, or supervisors examined participants’ performance on the job to identify gaps in their skills and behavior.
Interviews: We interviewed the supervisors to collect qualitative data; this allowed us to gather well-explained answers that highlighted the participants’ experiences, subjective perspectives, and emotions.
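The numeric scoring mentioned under Surveys can be sketched as follows. The 5-point Likert mapping here is an assumption for illustration; the project's actual coding scheme in SPSS may differ.

```python
# Assumed 5-point Likert coding; the project's actual SPSS coding may differ.
LIKERT = {
    "Very dissatisfied": 1,
    "Somewhat dissatisfied": 2,
    "Neither satisfied nor dissatisfied": 3,
    "Somewhat satisfied": 4,
    "Very satisfied": 5,
}

def mean_score(answers):
    """Convert text responses to numeric scores and average them."""
    numeric = [LIKERT[a] for a in answers]
    return sum(numeric) / len(numeric)

answers = ["Very satisfied", "Somewhat satisfied", "Somewhat dissatisfied"]
print(mean_score(answers))  # average of 5, 4, and 2
```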
Survey Response Rate for The Focus Group
The overall response rate was 85% (22 of 26 total participants).
Training completion rate for The Focus Group:
The training completion rate is 31% (8 of 26 participants attended all their training sessions).
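Both rates follow directly from the counts reported above:

```python
def rate(part, whole):
    """Percentage, rounded to the nearest whole number."""
    return round(100 * part / whole)

total_participants = 26
survey_respondents = 22
completed_all_sessions = 8

print(f"Response rate: {rate(survey_respondents, total_participants)}%")        # 85%
print(f"Completion rate: {rate(completed_all_sessions, total_participants)}%")  # 31%
```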
Of the 22 respondents, 10 were female and 12 were male; more than half (55%) were male.

36% of respondents have worked for the organization for more than 15 years.

Summary of Survey Responses for The Focus Group
1. Overall, how satisfied are you with the digital literacy training?

Interpretation: The chart shows that a large share of respondents (73%) were very or somewhat satisfied with the digital literacy training, 23% were very or somewhat dissatisfied, and a small share (5%) were neither satisfied nor dissatisfied. The satisfied respondents mostly liked the training for the following reasons:
The concept of learning how to work with computers and laptops in their job and daily life.
Building confidence in handling electronics and knowing the fact that other people also make mistakes.
Interacting with different participants from other depots.
Step by step, detailed, and interesting training topics.
The training creates long-term commitment in our staff, which results in higher employee retention and better performance at work. (Supervisor’s opinion)
Helpful and knowledgeable Trainer.
From the survey, 27% of the respondents (those who were neither satisfied nor dissatisfied, somewhat dissatisfied, or very dissatisfied) expressed some dislike of the digital literacy training. The reasons indicated by the survey responses and the participants’ supervisors include:
Difficulty catching up and understanding the topics.
Learning goals and expectations were not stated.
Lack of interest.
A large group of people with different levels of knowledge.
Participants needed to be familiar with Zoom before the training sessions started.
2. Did you experience any technical issues?

Interpretation: The chart shows that 55% of the respondents experienced technical issues during their training, while 45% did not. The main issues were the tablet screen size, the trainer’s sound quality, tablets or other devices not functioning during the training, and respondents’ unfamiliarity with the tablet.
3. How clear was the presentation of information?

Interpretation: The chart shows that approximately three quarters (77%) of the respondents found the presentation of information clear or somewhat clear, while 23% did not find it clear at all. The presentation issues most often reported by that 23% include a lack of rapport and individual attention due to a large training group with mixed ability levels, difficulty understanding the trainer because of the fast teaching pace, having to interrupt the trainer to ask questions, and the absence of stated training goals and expectations.
4. How do you feel about the amount of information presented?

Interpretation: The chart shows that 63% of the respondents thought the training sessions presented the right amount of information, while 23% found there was too little information and 14% too much. So while a significant proportion of respondents thought the amount of information was right, more than a third felt there was too much or too little, and reported feeling confused, frustrated, and unconfident.
5. How did you feel about the length of each session?

Interpretation: The chart shows that more than half (55%) of the respondents found the length of the training sessions satisfactory, while 45% found them long; none found them short. The average session lasted about 2 hours 30 minutes.
6. How did you feel about the size of your training group?

7. What is the best size of training group for you?

Interpretation: The chart shows that 64% of the respondents were satisfied with the size of the group, while 36% thought there were too many participants. None selected a small-sized group.
There were 24 beginner participants, but some were added in the middle of the training, so attendance varied between sessions. Around two thirds of the survey respondents were satisfied with the group size. Of those who were not, exactly half felt one-on-one training would work best for them, 38% preferred 2 to 5 people, and 13% preferred 5 to 10 people.
Recommendations:
We should identify the participants’ training needs to set training goals that align with the organizational objectives and the participants’ aspirations. Setting training expectations, and reminding participants of them at each session, will help the overall efficacy of the program. To do so, we need to take the following steps:
Identify what we are trying to achieve: We are going to obtain a deeper understanding of the barriers employees face related to ICT and of how these barriers influence employees’ confidence, knowledge, experience, and the skills they bring to the workplace. Moreover, we intend to gain an understanding of the effectiveness of a host of strategies and solutions that can be applied, introduced, or provided to individuals experiencing barriers. We also hope to answer: How can we as an employer (and other employers) assist in the development of ICT knowledge, skills, and confidence in employees who enter employment without already having these skills? What impact does access to technology outside the workplace have on the development of essential workplace skills? We are seeking to learn from employees who have a knowledge gap and have little to no access to technology away from work.
Before designing the learning programs, we should create a list of the knowledge, skills, and abilities the participants already have, drawing on surveys, observations, and supervisors’ opinions. We can talk to the participants or their supervisors and make it clear that we are genuinely interested in setting training goals and objectives that match the participants’ needs.
Evaluation and feedback are also important parts of the training process. We should conduct a pre-training survey to:
Provide better insights into the expectations of the participants.
Understand the participants’ level of knowledge, backgrounds, and abilities.
Highlight the aims, objectives, and usefulness of our training.
It’s best to solicit this type of feedback from participants as soon as each session is finished so that the information is fresh in everyone’s minds. We can use online post-training surveys or questionnaires for efficiency. We should conduct a post-training survey to:
Track the progress of the participants.
Assess our training goals and objectives.
Gather actionable feedback and suggestions from the participants.
We should divide the training group into smaller groups to improve communication, increase engagement and productivity, and boost collaboration. We can create beginner, intermediate, and advanced groups and include ongoing training. Instead of presenting a generic training session to all the participants, we can customize learning opportunities for each group. We can offer one-on-one meetings at the end of training sessions and set a schedule before the training finishes.
We should build more self-esteem. Participants understandably want to know what’s in it for them. We should create a training program to build the participants’ self-worth and self-esteem. Allow for mistakes and use empowering language to help participants develop a positive attitude toward learning. Encourage them to practice and explain that making mistakes now in training is much better than not knowing how to act in work and real-life situations. It’s a good idea to add encouraging or explanatory feedback after completing any activities. Phrases like “Well done!” or “You’re doing great!” or “You’ll nail it next time!” will help participants overcome their fears and keep trying.
We can include more gamified elements in the training topics to give the participants a sense of accomplishment and engagement. A few ideas for making training fun include offering incentives for attendance such as Tim Hortons gift cards, gamifying the sessions so employees earn points toward future rewards, and using games and videos.
We should consider that supervisors play a critical role in the participants’ training performance. Supervisors can: identify training needs, provide direction to the nature and scope of the training program, supply relevant content, evaluate the ultimate effectiveness of a training program.
The training should be an opportunity for the participants, not something they have to do. The participants should be reminded of the benefits of digital literacy at their workplace and in their daily lives, such as local online grocery shopping, online banking, personal monitoring devices for healthcare, safety, and security, connecting with family and friends, and playing online games with others.
The training should be with much smaller groups, for shorter periods of time. In a perfect scenario, having short one-on-one sessions with everyone would build trust between the trainer and trainee. Having these shorter sessions individually for a few weeks, followed by larger, longer group sessions to finish off the learning cycle might be one way to go. Confidence is huge for our employees, so building up that confidence before throwing them into a group situation will lead to a much higher success rate in my opinion.
To create the digital literacy curriculum, we should follow these steps:
1. Identify the participants
- Who are we going to be teaching? Employees who experience disabilities? Seniors who are not familiar with digital literacy but are realizing it is never too late to start learning?
- What are the barriers for each participant?
We can divide the participants into smaller groups according to their abilities.
2. Indicate the time frame
- How many weeks will we teach, and how often for each group?
- Will the lessons be taught early in the morning or in the afternoon?
Early in the morning, participants often cannot engage with difficult concepts, so the trainer can start with some warm-up activities to get their brains working.
The trainer should pick a few core topics to explore, starting with simple, basic concepts.
3. Specify the topics division
- How much time will it take to cover each topic? For example, for Outlook, will it take three hours in one session, and do we want to allocate some time to reviewing it each session through activities?
- We should put the topics in order of how they would make sense to the participants. A common pattern is beginning with the easiest and most basic skills and finishing with the most complex material.
4. Find resources
- Do we want to create most of the lessons and need some information resources to help us along? Do we need a website with interactive modules that we can use for the lessons?
5. Plan each lesson and include interactive activities
- What type of teaching will the participants enjoy? Interactive activities? Group discussions? Will they need follow-ups and reminders after the classes?
- Include as much hands-on experience as possible to engage the participants. We can use quizzes and mini games to help us assess how participants acquire the knowledge and boost their engagement. True/False, Matching, drag and drop activities.
6. Make the topics accessible for everyone
- Will we have any participants with accessibility needs (physical, learning-related, etc.)? What can we do to accommodate these needs?
While planning the lessons, we need to make sure that participants with accessibility needs are able to attend the sessions.
Digital accessibility practices are important for participants’ needs. Some practices include:
Providing automatic captioning and transcripts for Zoom.
Distributing the training transcript and notes to all participants after each meeting.
Using clear, plain language, delivered at a measured pace, instead of complicated language.
Making sure text size and the color contrast of fonts can be adjusted.
Ensuring application accessibility.
We should consider accessibility training subjects in our training curriculum such as Cortana, NVDA screen reader, color contrast, audio closed captioning, etc.
We should maintain a good pace to keep the participants engaged and ensure progress is made. Pacing that is too slow bores the participants, while pacing that is too fast overwhelms them and leaves them unable to keep up.