
1 Introduction

Researchers have explored the use of paper in work practices where complexity made the transition to or adoption of technology difficult (Bishop 2002; Marcu et al. 2013; Piper et al. 2013; Turner 2010). Extensive research also exists in the area of information management (Bruce et al. 2004; Buttfield-Addison et al. 2012; Oh and Belkin 2011; Trullemans and Signer 2014). Aside from our prior user study, no previous research has empirically investigated the use of technology versus paper for managing children’s educational information. Although the small group of participants in the first user study provided an indication of the effectiveness of MyStudentScope versus paper, a study with a larger group of users was needed to further validate the results. For this reason, we conducted another study, informed by the lessons learned from our prior user study, with a larger sample size. The study aimed to answer the following research questions.

  • Are parents able to complete information retrieval tasks more quickly using paper-based methods or MyStudentScope?

  • Are parents more frustrated completing information retrieval tasks using paper-based methods or MyStudentScope?

  • Are parents able to make decisions more effectively using paper-based methods or MyStudentScope?

  • What are the challenges for parents when using MyStudentScope to complete tasks?

  • How can we improve the design of MyStudentScope to better meet the needs of parents?

The design of the study was modified to address challenges that may have negatively impacted prior results. For example, the scrolling function was improved to make it easier to view all available data. The course selection function was also revised to improve selection efficiency.

The pre- and post-test questionnaires that had been completed on paper during the preliminary studies were converted into four online surveys. Some of the tasks were also modified to reduce the amount of writing required of participants to express their answers. The motivation for these changes was to decrease participant fatigue from writing during the test, so that participants would be willing to provide more complete and informative feedback to the survey questions.

Twenty-three (23) participants took part in this study. Situations parents encounter when receiving or interacting with information regarding their children’s education and extracurricular activities were simulated during the study. A within-group design was adopted, and each participant completed functionally equivalent tasks using paper-based methods and MyStudentScope (MSS). Task completion time, success rate, and perceived level of frustration were documented. Participants also provided feedback regarding their preferences and challenges related to the tasks under both conditions. The results suggest that, compared to the paper-based solution, MyStudentScope significantly improved efficiency and reduced the level of frustration for parents managing educational information.

2 Method

2.1 Participants

Participants included (1) parents of students in grades Kindergarten through 12 who currently use a school-provided electronic student information system, (2) parents with children in grades Pre-Kindergarten through 12 and older children who have used such a system in the past, and (3) parents of young children who may use such a system in the future. Overall, 23 parents with at least one child between the ages of 0 and 18 participated in the study (7 males and 16 females). Some of the participants also had children over the age of 18. Thirteen (13) of the participants were between the ages of 31 and 40 (average: 41, stdev: 8.01). The majority of participants had more than one child (95.45%). Figure 1 reflects the grade level distribution of the children of the participants. Four parents who participated in study 1 also participated in this usability study.

Fig. 1. Grade level distribution of the children of study participants

All participants had been using a computer, smartphone or tablet daily for more than ten years. Sixteen (16) of the participants had a school system-provided education management system available to them. The majority (13) of those with access to an education management system accessed it at least once per quarter. Three of the respondents with access to a system did not use it. Table 1 shows the general demographic information for each participant, including whether an education management system is available to the parents through their child’s school and, if available, whether the parent uses it.

Table 1. General demographic information for participants

2.2 Experimental Design and Procedure

A within-group design was adopted for this study. Each participant completed similar tasks related to the management and use of educational information for two students under two conditions: a paper-based condition and a MyStudentScope condition. The order of conditions was counterbalanced to control for learning effects. Eleven of the participants completed the tasks under the paper condition first, and 12 completed the MyStudentScope condition first.

At the beginning of the study, the participants completed a pre-test questionnaire to provide information regarding their demographics, computer and information management experience, and preferences. During the formal task session, participants completed a total of 24 tasks: 14 using MyStudentScope and 10 using paper. At the beginning of the MyStudentScope condition, each user was given a brief demo of the MyStudentScope web portal. A MyStudentScope user guide was also available to participants as a reference during the test. Upon completion of the tasks for each condition, the participant was asked to complete a questionnaire regarding their satisfaction and frustration. Upon completion of all tasks, participants were asked to complete a questionnaire comparing their experience using paper to MyStudentScope. All participants completed the tasks; however, pre- and post-test survey responses were only recorded for 22 participants. In general, each session lasted approximately 1.5 to 2 hours.

To avoid privacy concerns, four fictional student data sets were created for the study: Amelia, Jack, Emily and Oliver. Two of the test data sets represented high performing elementary school students; one female and one male (Amelia and Jack). The other two test data sets represented average performing elementary school students; one female and one male (Emily and Oliver). Each test data set included assignment grades; course/report card grades; samples of the student’s work; and communications, schedules and notices from the school and extracurricular programs. The data was organized in a paper folder and in MyStudentScope for each data set. Depending on the test data set, the paper folder contained between 105 and 140 pages. The documents included report cards, interim reports, sample assignments, extracurricular schedules and sign-ups for the current school year and school newsletters for the current school year. The documents were organized chronologically with the most recent documents on top. The electronic equivalents of the documents and/or information reflected in the paper documents were uploaded into MyStudentScope for each test data set.

Experiment Environment.

The study was conducted in participants’ homes. The website was hosted on a DigitalOcean cloud server. Participants used laptop computers owned by the test facilitators and the Google Chrome browser to complete the pre- and post-test questionnaires and the MyStudentScope tasks.

Tasks.

The paper condition consisted of 10 tasks. The MyStudentScope condition had 14 tasks. The mapping of MyStudentScope and paper tasks to monitoring, communication, recovery and decision making functions is presented in Table 2. A single unpaired paper task asked participants what information they would use to remember their student’s accomplishment. The additional MyStudentScope tasks are specific to portal functionality and they all map to the monitoring function (see Table 3).

Table 2. Function to task mapping for study conditions
Table 3. Function to task mapping for additional MyStudentScope tasks

The tasks were presented as scenarios parents may face while their children are in school or participate in extracurricular activities. For MyStudentScope task 4, and corresponding paper tasks 1 and 2, a participant using the Emily test data set would be presented with the following task:

Emily’s teacher, Mrs. Keller, sent you the following message:

Dear Emily’s Parent,

The quality of Emily’s handwriting is poor. At times it is difficult for me to read the answers on her assignments. Please work with Emily to improve her penmanship.

Sincerely,

Mrs. Keller

You believe Emily’s teacher is mistaken. Show the test facilitator evidence in MyStudentScope/the folder that you could use to support your belief that Emily’s teacher is mistaken.

3 Results

Twenty-three participants completed the study. All participants conducted 14 tasks under the MyStudentScope condition and 10 tasks under the paper condition. Task performance was measured through three variables: the time spent completing a task, the success rate, and the total number of pages visited to complete a specific task. Comparing the total number of pages visited with the minimum number of pages needed to complete a task can provide insight into the efficacy of the navigation design of the MSS web portal.

3.1 Task Completion Time

Among parents who participated in the final study (N = 23), paired samples t tests suggest a significant difference between the MyStudentScope condition and the paper condition in paired tasks 3, 4, 7 and 8: the time it took to determine the grade for a specified grade level and marking period (t (8) = 5.36, p < 0.05) (Task 3), determine if there were schedule conflicts for a specific date (t (8) = −4.73, p < 0.05) (Task 4), determine trends in student grades (t (8) = −2.10, p < 0.05) (Task 7), and determine if a similar incident had occurred in the past (t (8) = −6.28, p < 0.05) (Task 8).

The comparison between the times to complete paired tasks 3, 4, 7, and 8 using MyStudentScope and paper is presented in the graphs below (see Fig. 2, Fig. 3, Fig. 4 and Fig. 5). With the exception of one participant’s completion time for paired task 4, all participants completed paired tasks 3, 4 and 8 in less time using MyStudentScope than paper. Paired samples t tests found no significant difference between the MyStudentScope condition and the paper condition in the time it took to complete the other tasks (Task 1: t (8) = −.79, n. s.; Task 2: t (8) = −.50, n. s.; Task 5: t (8) = .20, n. s.; Task 6: t (8) = 1.47, n. s.).

Fig. 2. Completion times (seconds) for paired task 3 - determine grade for specified grade level and marking period

Fig. 3. Completion times (seconds) for paired task 4 - determine if there are schedule conflicts for specific date

Fig. 4. Completion times (seconds) for paired task 7 - document trends about the student’s grades from K through the current year

Fig. 5. Completion times (seconds) for paired task 8 - determine if a similar incident occurred in the past
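The paired comparisons above can be reproduced from per-participant completion times with a standard paired-samples t test. The sketch below is illustrative only; the completion times shown are hypothetical, not the study’s measurements.

```python
from math import sqrt

def paired_t(a, b):
    """Paired-samples t statistic for two matched lists of completion times.

    Returns (t, df), where df = n - 1 for n participant pairs.
    """
    assert len(a) == len(b) and len(a) > 1
    d = [x - y for x, y in zip(a, b)]                     # per-participant differences
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)   # sample variance of differences
    return mean_d / sqrt(var_d / n), n - 1

# Hypothetical completion times (seconds) for one paired task
paper = [95, 120, 88, 130, 105]
mss = [60, 75, 55, 80, 70]
t, df = paired_t(paper, mss)  # positive t: paper took longer on average
```

In practice a library routine such as scipy.stats.ttest_rel would also return the p value; the manual version above simply makes the underlying computation explicit.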

3.2 Failed Tasks

An indicator of the efficacy of using MyStudentScope to complete tasks versus paper is the number of failed tasks under each condition. A task was recorded as successful if the participant was able to find the desired information and/or complete the required action. A task was recorded as a failure if the participant found incorrect information, failed to complete the required action, or indicated in the task response that he/she was unable to determine the answer.

The majority of the failures were observed when users attempted to determine whether there were schedule conflicts for a specific date and whether a similar incident had occurred in the past using paper. Only one participant failed to complete one of those tasks using MyStudentScope. For all but that single instance, participants were able to successfully complete each task using MyStudentScope.

3.3 Pages Visited

An indicator of the efficiency of using MyStudentScope to complete tasks is the number of pages visited to perform each activity. In general, a higher number of pages visited indicates that the user did not know how to use the tool and was searching for the means to complete the task. In most cases this resulted in more time spent and therefore lower efficiency. An optimal path was defined for each MyStudentScope task; it consists of the minimum number of pages necessary to complete the task accurately.

The ratio between the number of pages actually visited and the number of pages on the optimal path indicates how efficiently a task was completed; a higher ratio means users deviated substantially from the optimal path. The lowest ratios were observed on three tasks: (a) determining if there were schedule conflicts for a specific date (1.05), (b) recording an accomplishment (1.05), and (c) adding a new event to MyStudentScope (1.07). Most users navigated to the Events page and completed the task easily without any error. The highest ratio was observed on identifying and documenting trends in the student’s academic performance (3.05). Users should have been able to complete the task by visiting the Dashboard only, but some participants visited as many as 11 pages before completing the task (Fig. 6).

Fig. 6. Optimal and actual pages visited on average for each MyStudentScope condition task
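The navigation ratio described above can be computed directly from page-visit logs. The sketch below is a minimal illustration; the page counts are hypothetical, not the study’s data.

```python
def navigation_ratio(visited, optimal):
    """Mean number of pages visited divided by the optimal page count.

    visited: per-participant page counts for one task.
    optimal: minimum number of pages needed to complete the task.
    A ratio near 1.0 means participants stayed close to the optimal path.
    """
    return (sum(visited) / len(visited)) / optimal

# Hypothetical counts: most users took the optimal path; one wandered.
add_event = navigation_ratio([1, 1, 1, 2], optimal=1)     # 1.25
trend_task = navigation_ratio([1, 3, 11, 1], optimal=1)   # 4.0
```

Computing the ratio from the mean (rather than per participant) matches a task-level summary; averaging per-participant ratios first would weight each participant equally instead.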

3.4 Observed User Frustration

Observed user frustration was measured by comments made by the participant while completing each task as well as the participant’s body language. Non-verbal signs that signaled to facilitators that participants were frustrated included changes in breathing such as sighing or long exhales, rubbing the back of the neck, or shaking the head. Time taken to complete a task was not automatically assumed to factor into a participant’s level of frustration because, overall, participants were very patient when completing tasks under both conditions.

The observed levels of user frustration for the MyStudentScope tasks with equivalent paper tasks were recorded. Based on observed behavior, the two most frustrating tasks were determining if there were schedule conflicts for a specific date (Task 4) and determining if a similar incident had occurred in the past (Task 8) using paper. For these two tasks, 13 out of 23 participants had a high or very high observed level of frustration. This drastically contrasts with the fact that no participant showed frustration at any level while completing paired task 4 using MyStudentScope. When completing the tasks, users made comments like, “I cannot figure out how to answer this!”, “[There are] a lot of paper to look through. This is a pain!” and “This is why we are stressed, right?”

Figure 7 depicts the observed user frustration during the study. The width of the red lines indicates the number of times each level of frustration was observed. Red lines in the lower left quadrant (unshaded area) indicate that participants showed low or no frustration completing tasks using MyStudentScope and paper. Red lines in the upper left quadrant (blue shaded area) indicate that participants showed more frustration using paper than MyStudentScope. Red lines in the upper right quadrant (unshaded area) indicate that participants showed high or very high levels of frustration under both conditions. Red lines in the lower right quadrant (gray shaded area) indicate that participants showed more frustration using MyStudentScope than paper. The very wide red line in the lower left-most box indicates that there were nearly 100 tasks for which no frustration was observed under either condition. The thin red line in the gray shaded area indicates that there were a few incidents in which completing tasks using MyStudentScope was observed to be more frustrating than paper. The thickness and number of lines in the blue shaded area compared with those in the gray shaded area show that, overall, using paper was more frustrating to users than using MyStudentScope.

Fig. 7. Observed levels of user frustration (Color figure online)

3.5 Preferences Based on Survey Responses

To understand the participants’ preferences for managing information and their experience with technology, each participant completed a questionnaire before the test. Most parents indicated that they use both paper and technology to manage information. All participants agreed that managing information regarding their children’s education is important (Question 4). All also began the study with a positive opinion of the ease with which technology can be used to manage their children’s educational information (Question 5).

All participants answered a questionnaire after each test condition to evaluate their experience. Although users experienced some frustration with MyStudentScope due to their lack of familiarity with it, the majority of the participant feedback was in favor of the portal. In response to Question 1, the majority of participants (19) agreed or strongly agreed that it was easier to use MyStudentScope than paper. The majority of participants (20) also agreed or strongly agreed, in response to Question 3, that they could be more productive using MyStudentScope than paper.

4 Implications

Knowledge of the needs and preferences of parents when managing and using information regarding their children’s education can help designers create more functional information management tools to support them. This knowledge could also be applied to the design of the electronic student information systems available in most school systems, thereby extending their functionality to support the needs of both parents and educators. When designing these tools, developers should keep the recommendations of experts in education in mind. Per the experts, parents need to document teacher phone calls; keep records of requests for appointments by the parent or teacher; keep copies of school work/assignments, especially those with which the parent or teacher has expressed concern; keep copies of any official reports that have been signed and dated; keep children’s pre-school portfolios; and retain baseline assessment results. Therefore, any system built for parents should have a means of accepting and saving this information. Designers should keep in mind the reasons parents use the information they keep, as these reasons drive the metadata parents are able to record with information saved in the system. Dates are particularly important because parents may be able to use a timeframe to recall or recover information when needed. Designers should remember that parents may need to look across many years’ worth of educational data at one time to get a good understanding of the child’s progress. For this reason, graphical representations of the data should be designed and made available as much as possible.

Keeping in mind that the system is only as useful as the data in it, it is important for parents to remain diligent in recording information in MyStudentScope. The more information they add regarding grades, behaviors, and observations, the clearer the picture of their child’s academic progress will be. This is especially important when entering metadata about uploaded documents, grades or comments. The data is important, but the details associated with it, such as the date, subject area, and comments about whether the data point reflects a positive or negative situation, are invaluable for being able to search for and recover the data efficiently in the future. Parents’ awareness of the types of data they should retain regarding their children’s education, together with a means to manage that information as a whole, may motivate more parents to review their children’s academic progress regularly. The ability to quickly detect trends and anomalies will also empower parents to be proactive in addressing concerns with respect to their child’s educational development instead of relying on educators to point out potential areas of concern. Taking action early may improve their child’s chances of educational success.

Educating children is a team effort between the parent, student and educator. Informed, engaged parents communicating effectively with educators can lead to improved outcomes in the child’s academic development. Parents’ use of MyStudentScope to remain aware of their children’s progress and identify areas of concern with tangible evidence will allow them to have more meaningful and effective conversations about issues with the child’s progress. Educators will benefit from parents’ ability to provide actual evidence to support their views regarding their children’s academic progress or concerns, instead of having to sift through anecdotal accounts that may be difficult or impossible to verify. This clarity in communication and identification of issues will enable educators to more quickly develop a strategy to address concerns raised by the parent. Parents and teachers will be able to track whether changes are leading to the expected results with respect to the child’s development.

5 Limitations and Future Research

The research only involved testing of novice users of MyStudentScope, and the participants completed their interaction with MyStudentScope in only one session. In reality, parents must manage information regarding their children’s education over many years. As stated by many participants in their post-test survey responses, their productivity may improve with more experience using MyStudentScope. A longitudinal study of several weeks or even months is needed to understand the true efficacy of the MyStudentScope web portal versus the traditional paper-based approach. A six-month period might be ideal because it would cover approximately three marking periods or terms for most schools. It is possible that significant differences might be observed for some of the tasks as users gain more experience with MyStudentScope. In addition, a longitudinal study would allow the researchers to observe the learning curve with the MyStudentScope web portal and examine how interaction patterns and strategies evolve as users gain more experience.

The study was conducted using manually generated test data based on fictional students. Parents have greater familiarity with their own child’s academic performance, extracurricular activities and other factors that impact their educational development. Future studies are needed to investigate how parents use the MyStudentScope web portal in a realistic setting with their own children’s actual data. Those studies will allow the researchers to better gauge the effectiveness of the portal for managing educational information.

Finally, the MyStudentScope web portal was designed and implemented as a traditional website. With the rapid development of mobile computing, more and more educators and parents have started to use mobile devices and applications to communicate, access, and manage students’ educational information. Compared to a traditional website, a mobile application delivered through a smartphone or other mobile device could be easier to access in a variety of environments (e.g., work, public spaces) in addition to home. Another advantage of mobile applications is alert and notification functions, which are usually easier to check than email. We plan to design and implement a mobile application that delivers functions similar to those of the MyStudentScope web portal.

6 Conclusion

The results of the comprehensive study are consistent with the results of the preliminary studies in demonstrating that MyStudentScope is a viable solution for improving the efficiency and efficacy of parental management and use of their children’s educational information. A significant difference in task completion time was found for half of the paired tasks completed using MyStudentScope and paper. User responses in post-test questionnaires, observed levels of user frustration, and success rates all show that using MyStudentScope is generally less frustrating and more effective. We plan to further examine the efficacy of the design and users’ interaction patterns through a long-term field study.