There were statistically significant differences in perceptions of the usefulness of digital technology: teachers at school A gave statistically significantly lower evaluation scores than teachers at the other schools for the following pedagogical practices: small-scale project work, F (2,54) = 12.841, p = .000; practicing skills, F (2,54) = 10.866, p = .000; small-scale products (like writings during one lesson), F (2,54) = 12.256, p = .000; net discussions related to the topic, F (2,54) = 6.412, p = .003; and presenting information and support for illustration, F (2,54) = 12.148, p = .000. Tamhane’s T2 post-hoc comparisons were used to calculate the differences between the schools.
Teachers were asked about the use of various digital applications and Internet services in their own teaching; there were no statistically significant differences between schools in how much they reported using various applications and the Internet.
Teachers were also asked about using digital technology in various pedagogical practices. In Table 5, the means and SDs of all practices are presented. There were a few statistically significant differences in the reported use of digital technology.
The means and SDs of pedagogical practices with digital technology and statistical differences
School A (n = 16) | School B (n = 21) | School C (n = 18) | p value | ||||
---|---|---|---|---|---|---|---|
Mean | SD | Mean | SD | Mean | SD | ||
Large projects | 3.3 | 1.483 | 3.6 | 2.376 | 4.1 | 1.731 | |
Small-scale projects | 3.6 | .256 | 4.8 | .284 | 5.6 | .231 | A < C, .000 |
Students’ independent work | 2.1 | 1.289 | 2.6 | 1.359 | 2.8 | 1.689 | |
Students’ inquiry work | 2.6 | 1.408 | 2.9 | 1.590 | 3.3 | 1.638 | |
Students’ fieldwork | 2.0 | 1.033 | 2.1 | 1.315 | 3.0 | 1.534 | |
Virtual laboratory work and simulations | 1.6 | .957 | 1.6 | 1.284 | 2.2 | 1.200 | |
Practicing skills | 2.7 | 1.352 | 3.7 | 1.683 | 5.2 | 1.581 | A < C, .000 |
Small-scale products | 3.6 | .964 | 4.6 | 1.499 | 5.4 | 1.145 | A < C, .000 |
Discussion on the net | 1.7 | .873 | 2.6 | 1.690 | 2.8 | 1.555 | |
Presenting information and support for illustration | 3.5 | 1.414 | 4.6 | 1.962 | 5.4 | 1.243 | A < C, .001 |
Statistically significant differences were found in the following items: small-scale projects, F (2,54) = 13.233, p = .000; practicing skills, F (2,54) = 10.988, p = .000; small-scale products (like writings), F (2,54) = 9.084, p = .000; and presenting information and support for illustration, F (2,54) = 5.934, p = .005. Tamhane’s T2 post-hoc comparisons were used for calculating the differences between the schools.
The results showed, first, that there were no statistically significant differences between schools in teachers’ self-evaluated digital competence, and that teachers evaluated their competence in basic digital applications as quite high (scale 1–5): using email (mean 4.7), searching for information on the Internet (mean 4.7), word processing (mean 4.4), downloading files from the Internet (mean 4.2) and using the digital learning environment (mean 3.8). These formed a group of basic digital competence. The second group of applications consisted of using spreadsheets (mean 3.2), digital image processing (mean 3.1), graphics (mean 2.9) and social forums (mean 2.9). The lowest means were for virtual meeting tools (mean 2.3), creating www-pages (mean 2.3), publishing tools (mean 2.2), writing a blog (mean 2.2), publishing www-pages (mean 2.0), producing information for a wiki (mean 1.9), voice and music (mean 1.9) and programming (mean 1.4).
Figure 2 shows the means of teachers’ need for support and training for using digital technology.
Teachers’ need for support and training of digital technology
Teachers at school A evaluated that they needed both technical and pedagogical training less than teachers at the two other schools, and there were statistically significant differences between schools A and B in the need for technical training, F (2,54) = 9.993, p = .000, and in the need for pedagogical training, F (2,54) = 12.719, p = .000, indicated with * in Fig. 2.
Pupils were asked which applications they use at school. In Table 6, the means and SDs of those applications in which there were statistically significant differences between the schools are described.
Means, SDs and statistical differences of digital applications and pedagogical practices used
School A (n = 44) | School B (n = 100) | School C (n = 31) | p value | ||||
---|---|---|---|---|---|---|---|
Mean | SD | Mean | SD | Mean | SD | ||
Using digital applications | |||||||
Using word processing | 3.7 | .544 | 2.8 | .857 | 3.0 | 1.000 | A > B, .000; A > C, .002
Using spreadsheets | 2.7 | .694 | 1.8 | .899 | 2.1 | 1.076 | A > B, .000 |
Using email | 4.0 | .590 | 2.4 | 1.066 | 2.8 | 1.098 | A > B, C, .000 |
Information search from the Internet | 4.0 | .549 | 2.9 | .993 | 3.3 | .945 | A > B, .000; A > C, .002
Publishing on the Internet | 2.5 | 1.045 | 1.9 | .968 | 2.5 | 1.434 | A > B, .002 |
Using social forums | 3.0 | 1.562 | 2.3 | 1.228 | 3.3 | 1.137 | C > B, .000 |
Using learning environments | 3.5 | .952 | 2.6 | .998 | 2.3 | .973 | A > B, C, .000 |
Publishing in a web blog | 3.0 | 1.137 | 1.6 | .960 | 2.2 | 1.267 | A > B, .000 |
Publishing pictures, texts or reports | 2.5 | 1.000 | 1.9 | .988 | 1.9 | 1.221 | A > B, .002 |
Pedagogical practices with digital technology | |||||||
Developing my thoughts about the topic in a collaborative discussion | 2.7 | .851 | 1.7 | .886 | 2.0 | 1.251 | A > B, .000 |
Teacher guidance through the net for independent learning | 2.4 | .868 | 1.7 | .949 | 2.5 | 1.312 | A > B, .001 |
Freedom to surf on the Internet when assignments are done | 3.4 | 1.203 | 2.7 | .973 | 3.7 | .965 | A > B, .004; B < C, .000
Contact with pupils in other schools via email or the Internet | 3.0 | 1.285 | 2.2 | 1.242 | 2.8 | 1.440 | A > B, .001 |
Information search from the Internet | 3.9 | .443 | 2.9 | .865 | 3.5 | .890 | A > B, .000; B < C, .004
Publishing pictures, texts or reports | 2.5 | 1.000 | 1.9 | .988 | 1.9 | 1.221 | A > B, .003
The statistical significance of differences in means between the pupils of the schools was analysed using one-way ANOVA. The analysis indicated statistically significant differences in the means of the following items: using word processing, F (2,172) = 18.909, p = .000; using spreadsheets, F (2,172) = 16.686, p = .000; using email, F (2,172) = 38.490, p = .000; using social forums, F (2,172) = 9.940, p = .000; publishing in a web blog, F (2,172) = 22.253, p = .000; using learning environments, F (2,172) = 17.316, p = .000; publishing pictures, texts or reports, F (2,172) = 5.811, p = .004; developing my thoughts about the topic in a collaborative discussion, F (2,172) = 14.735, p = .000; teacher guidance through the net for independent learning, F (2,172) = 9.678, p = .000; freedom to surf on the Internet when assignments are done, F (2,172) = 15.361, p = .000; contact with pupils in other schools via email or the Internet, F (2,172) = 8.367, p = .000; information search from the Internet, F (2,172) = 22.464, p = .000; and publishing on the Internet, F (2,172) = 7.281, p = .001. Tamhane’s T2 post-hoc comparisons were used for calculating the differences between the schools.
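Every per-item comparison reported above follows the same pattern: a one-way ANOVA of item ratings across the three school groups, followed by post-hoc comparisons when the overall F is significant. As a minimal illustration with hypothetical rating data (not the study's data), the F statistic can be computed from the between-group and within-group sums of squares:

```python
# Illustrative sketch (hypothetical data, not the study's): the one-way ANOVA
# F statistic used throughout this section to compare item means across schools.

def one_way_anova_f(*groups):
    """Return (F, df_between, df_within) for a one-way ANOVA over the groups."""
    k = len(groups)                               # number of groups (schools)
    n_total = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n_total

    # Between-group sum of squares: squared deviations of group means,
    # weighted by group size
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: deviations of scores from their group mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)

    df_between, df_within = k - 1, n_total - k
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, df_between, df_within

# Hypothetical 1-7 Likert ratings for one item from pupils at three schools
school_a = [4, 3, 4, 5, 4, 3]
school_b = [5, 4, 5, 6, 5, 4]
school_c = [6, 5, 6, 7, 6, 5]

f_stat, df1, df2 = one_way_anova_f(school_a, school_b, school_c)
print(f"F({df1},{df2}) = {f_stat:.3f}")  # prints: F(2,15) = 10.588
```

A significant F only indicates that some group means differ; pairwise procedures such as Tamhane's T2, which does not assume equal group variances, are then used to locate the differences, as in the analyses above.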
There was also a difference in the statement about the use of ICT during leisure time for schoolwork, in which pupils at school A had higher scores than pupils at the other schools. The statistically significant differences were between school A ( M = 3.7, SD = .553) and schools B ( M = 2.3, SD = .833) and C ( M = 2.2, SD = .956); F (2,172) = 55.259, p = .000.
Pupils at all three schools liked to use ICT at school, and there were no statistically significant differences concerning the statements measuring this: the use of ICT is easy ( M = 4.2, SD = 1.034), the use of ICT makes learning more interesting ( M = 3.9, SD = 1.111) and pupils would like to use ICT more at school ( M = 3.8, SD = 1.192). Furthermore, there were no statistically significant differences in the use of technology at home and during leisure time.
Pupils also evaluated their competence in using various digital applications. The statistically significant differences in means and SDs between the pupils from the three schools are described in Table 7.
Pupils’ self-evaluated digital competence in some applications (means, SDs and statistical differences)
School A (n = 44) | School B (n = 100) | School C (n = 31) | p value | ||||
---|---|---|---|---|---|---|---|
Mean | SD | Mean | SD | Mean | SD | ||
Word processing | 4.5 | .504 | 4.0 | .953 | 3.5 | .926 | A > B, .000 A > C, .000 |
Spreadsheets | 4.0 | .731 | 3.0 | 1.303 | 2.9 | .806 | A > B, .000 A > C, .000 |
4.9 | .321 | 4.6 | .680 | 4.2 | 1.036 | A > C, .000 | |
Writing in a web blog | 3.9 | .830 | 2.9 | 1.463 | 3.0 | 1.390 | A > B, .000
Virtual learning environment | 4.4 | .542 | 3.8 | 1.170 | 3.4 | 1.174 | A > B, .000 A > C, .000 |
The differences were analysed by using one-way ANOVA. No differences were found in applications which tend to be less used in schools, such as digital image processing, publishing tools, voice and music applications or programming. The analysis indicated statistically significant differences in means between pupils of participating schools in the following items: word processing F (2,172) = 13.287, p = .000; spreadsheets F (2,172) = 15.092, p = .000; email F (2,172) = 10.002, p = .000; information search from the Internet F (2,172) = 6.492, p = .002; writing a web blog, F (2,172) = 9.441, p = .000; and using virtual learning environments F (2,172) = 9.042, p = .000. Tamhane’s T2 post-hoc comparisons were used for calculating the differences between the schools.
In Table 8, the results of the separate data sets have been integrated and scored for each school.
Evaluated level of practices in each school
Phenomenon investigated | School A | School B | School C |
---|---|---|---|
A. Vision of the school | 2.3 | 1.3 | 2.3 |
A1. The vision of using digital technology | 2 | 1 | 2 |
A2. Consensus about the vision | 2 | 1 | 2 |
A3. Intentional development-orientation | 3 | 2 | 3 |
B. Leadership | 2.7 | 2.0 | 3.0 |
B1. Shared leadership | 3 | 2 | 3 |
B2. Networking of the principal | 2 | 1 | 3 |
B3. Role of the principal | 3 | 3 | 3 |
C. Practices of teaching community | 3.0 | 1.7 | 2.7 |
C1. Pedagogical collaboration and sharing of expertise | 3 | 1 | 3 |
C2. Development practices | 3 | 2 | 2 |
C3. Networking of teachers | 3 | 2 | 3 |
D. Pedagogical practices | 2.5 | 1.5 | 2.0 |
D1. Perceptions of using digital technology in education | 2 | 2 | 2 |
D2. Pedagogical practices with digital technology | 3 | 1 | 2 |
E. School-level knowledge practices | 2.5 | 1.0 | 2.0 |
E1. Common knowledge practices with technology | 3 | 1 | 2 |
E2. Physical premises | 2 | 1 | 2 |
E3. Pupils’ involvement in school-level activities | 3 | 1 | 1 |
E4. School-level networking | 2 | 1 | 3 |
F. Digital resources | 2.75 | 1.75 | 2.0 |
F1. Utility of technical resources | 3 | 1 | 2 |
F2. Pupils’ digital competence | 3 | 2 | 2 |
F3. Teachers’ digital competence | 2 | 2 | 2 |
F4. Pedagogical and technical support | 3 | 2 | 2 |
The scores show differences between the schools: schools A and C are ‘strong’ schools in several major elements. At school A, digital resources are at an especially high level, and school-level working practices are generally at a high level. At school C, leadership practices and teaching community practices are at a high level. School B has the lowest scores in every major element. In the ‘Discussion’ section, we discuss the differences in more detail.
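The major-element scores in Table 8 appear to be simple means of their sub-element ratings, each rated 1–3 (the table rounds most results to one decimal, e.g. 2.3 for 2.33, but reports 2.75 for element F). Under that assumption, a short sketch reproduces the School A column:

```python
# Hedged sketch: the element-level scores in Table 8 appear to be plain means
# of the sub-element ratings (each rated 1-3). The ratings below reproduce the
# School A column; e.g. A. Vision = mean(2, 2, 3) = 2.33, shown as 2.3 in the
# table, and F. Digital resources = mean(3, 3, 2, 3) = 2.75.
school_a = {
    "A. Vision of the school": [2, 2, 3],                 # A1-A3
    "B. Leadership": [3, 2, 3],                           # B1-B3
    "C. Practices of teaching community": [3, 3, 3],      # C1-C3
    "D. Pedagogical practices": [2, 3],                   # D1-D2
    "E. School-level knowledge practices": [3, 2, 3, 2],  # E1-E4
    "F. Digital resources": [3, 3, 2, 3],                 # F1-F4
}

element_scores = {name: round(sum(r) / len(r), 2) for name, r in school_a.items()}
for name, score in element_scores.items():
    print(f"{name}: {score}")
```

The same aggregation applied to the School B and C columns yields the remaining element scores in the table.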
In the study, we investigated the practices at three schools based on six elements defined in the innovative digital school model. We aimed to find out, first, if those elements could help in defining good practices and suggestions for improvement for developing the schools with digital technology; and second, if the model revealed essential differences between the schools.
In order to answer the first research question about how the IDI school model helps to identify good practices and points to be improved in using digital technology for school change, we describe the practices of each school separately.
Among the characteristics of school A were advanced and established practices in shared leadership, practices of the teaching community, advanced pedagogical practices with technology and school-level knowledge practices, including the involvement of pupils and the systematic promotion of their digital competence through pedagogical activities. However, shared visions about digital technology were only emerging, teachers’ digital competence was only average and perceptions of the pedagogical usage of technology varied considerably between teachers, although there were examples of inspiring pedagogical methods. Teachers did not report needing support for using technology, which probably indicates both a quite good level of digital competence and well-organised support practices in the school. Pupils’ self-reported digital competence was at a high level, especially concerning basic applications. Pupils reported using technology quite often during leisure time for school-related activities, and at school for various basic activities, but also for collaboration and networking. Based on the results, the following suggestions for improvement can be made for school A: (1) the teaching staff should focus on crystallising and sharing the school’s visions of using digital technology as the basis for further development (elements A1 and A2); (2) teachers should share their pedagogical ideas and experiments, e.g. in organised meetings and workshops (elements C1 and C2); and (3) teachers should develop their digital competence, such as by making use of the training resources made available by the city and by organising school-level small-scale training (elements F3 and F4).
School B had some shared leadership practices and the principal was appreciated, but otherwise the school was not very advanced in any of the measures. Attitudes towards development efforts were positive, but established practices were lacking. There were teachers who collaborated with each other, participated in development projects and used digital technology in teaching in advanced ways, but this activity was based on the teachers’ own initiative and voluntariness. Especially at the school level, knowledge practices were minimal, both concerning the promotion of pupils’ involvement and digital competence, and concerning school-level networking. Teachers at the school reported needing both technical and pedagogical support in using digital technology. Based on the results, the following suggestions for improvement can be made for school B: (1) it is important to create a common vision for developing the use of digital technology (element A1) and to promote a development orientation among teachers (element A3); (2) the principal and the management team should create and organise systematic common practices to carry out improvements in all developmental areas (elements in C); and (3) the digital resources should be evaluated and developed (all elements in F), and especially teachers’ digital competence should be improved (elements F3 and F4).
School C represents a school with high-level leadership practices and a strong collaboration culture, both inside the school and in the active external networking of the principal, the teachers and the whole school. The school had a strong development orientation in general, but this had not yet been realised in the school-level knowledge practices, digital resources or advanced practices of using technology in teaching. School C has much potential for improvement, and based on the results, the following suggestions can be made: (1) the usage of digital technology for school improvement should be made more deliberate through agreement on shared visions (elements A1 and A2); (2) the school should create practices for the systematic development of pedagogical and knowledge practices (elements D and E); and (3) all pupils’ and teachers’ digital competence should be improved, both through pedagogical practices (element D2) and through training and support (elements F2, F3 and F4).
To answer the second research question about how the model reveals essential differences in digital technology for school change, we compared the practices of schools by summarising the results of data analyses.
The results of the study indicate that there were some clear differences between the schools, although they also had much in common, especially in the principal’s role and teachers’ digital competence; the common characteristics might be a result of the city’s common policies and practices in these issues. Elements that depend strongly on school-level decisions differed between the schools; these include teachers’ pedagogical practices and the school community’s practices, such as the sharing of vision-level decisions. According to previous studies (Vieluf et al. 2012; OECD 2014), shared community-level practices are central to sustainable school improvement, but they are not yet widespread in schools and require extending the teachers’ professional role beyond taking responsibility only for their own teaching in classrooms.
A clear difference between the three schools was the presence or absence of practices involving pupils in school-level activities. Only at school A had shared, established practices for pupil engagement at the school level been developed, such as responsible pupil teams (e.g. media and environment teams) or pupils acting as guides in using digital technology. Various participatory practices presuppose seeing pupils in an active role in the classroom or at school, not only as objects of teaching during lessons (Facer 2012; Kehoe 2015; Pereira et al. 2014).
The nature of pedagogical practices with digital technology also differed between schools. At school A, pupils reported using digital technology more than pupils at the other two schools, both in the classroom and at home for school-related activities. The use focused on general applications and pedagogically ‘advanced’ practices, such as using a virtual learning environment and collaborating via the web. These practices probably helped to improve pupils’ basic digital competence: the regular use of digital tools was an essential condition for competence learning (see also OECD 2011; Aesaert et al. 2015). Furthermore, classroom practices were most advanced at school A, and a comparison of the teachers’ survey answers between the schools indicated that teachers at school A used and believed in teacher-centred practices with digital technology less than teachers at school C.
The innovative digital school model was not developed primarily for detailed comparisons of differences between schools. A more useful approach is to examine school profiles: the shape of the profile demonstrates the emphasis on the practices inside a school, and the level of the profile elements helps each school to position its strengths and development needs compared with reference schools. Figure 3 presents the results of Table 9 in a visual form illustrating the profiles of the three schools investigated.
A summary of the scores of the three schools in the elements of the IDI school model
The analysis framework of the phenomena and the data
Investigated phenomenon | Dimensions of the phenomenon | Data sources |
---|---|---|
A. Visions of the school | ||
A1. Visions of using digital technology | 1. No clear visions 2. Emphasis on technical issues, like increasing equipment 3. Using digital technology for overall improvement | Teacher interviews, principal interview |
A2. Consensus about the vision | 1. No common vision 2. Emerging; vision not present in daily work 3. Consensus of the vision; the vision is important for the school | Teacher interviews, principal interview |
A3. Intentional development orientation | 1. No emphasis on development efforts 2. Individual initiatives supported, positive attitudes towards change 3. Focused collaborative development practices, the whole community accepts and participates | Teacher interviews, principal interview |
B. Leadership | ||
B1. Shared leadership | 1. Principal-centred community, no teams 2. Occasional teams or teams based on voluntary participation 3. Commonly agreed teacher teams, true responsibilities | Principal interview, teacher interviews |
B2. Networking of the principal | 1. No networking or only for administration 2. Networking with colleagues and administration, mainly with the same educational level 3. Active networking with various kinds of educational institutions and actors outside educational field | Principal interview |
B3. Role of the principal | 1. Mainly routine management 2. Good human resources leader, positive for development but not proactive 3. Organiser, developer of resources, initiator of improvement | Teacher interviews, principal interview |
C. Practices of the teaching community | ||
C1. Pedagogical collaboration and sharing of expertise | 1. Occasional collaboration between teachers of same subjects or class levels; material shared between a few teachers 2. Collaboration between teachers of same subjects or class levels; experiences shared occasionally in the school 3. Organised pedagogical collaboration and sharing practices | Teacher interviews, principal interview |
C2. Development practices | 1. No collaborative development practices 2. Occasional development activities based on active individuals; freedom to develop 3. Established collaborative and individual development practices | Teacher interviews, principal interview |
C3. Networking of teachers | 1. No networking or few teachers are networking 2. Several teachers have networks, but mainly with colleagues of the same subject 3. Several teachers active in networks, various types of contacts inside and outside school | Teacher interviews, teacher questionnaires |
D. Pedagogical practices | ||
D1. Perceptions of using digital technology in education | 1. Technology replacing teacher’s routines or for small-scale content learning 2. Technology as pupils’ tool for preparing and presenting pieces of work and for information search; emphasis on individual learning 3. Technology for diverse collaborative and creative learning activities | Teacher questionnaires, teacher interviews |
D2. Pedagogical practices with digital technology | 1. Technology used in a teacher-centred way, content learning activities, applications related to textbooks or teacher presentations 2. Technology used according to the teacher; learner-centred activating tasks in individual lessons, short (one or two lessons) individual or small group activities, teacher-directed assignments 3. Teachers use technology in multiple ways; process-type activities and integrated projects; technology as a tool, but also used to improve digital competence | Classroom observations, teacher interviews, teacher and pupil questionnaires |
E. School-level knowledge practices | ||
E1. Common knowledge practices with technology | 1. No or limited common practices 2. Some shared practices or agreements, concern mainly technology 3. Agreements, models and guidelines related to various knowledge practices and competencies | Teacher interviews, principal interview, classroom observations |
E2. Physical premises | 1. Inflexible spaces mainly for class teaching 2. Various types of spaces, but not enough flexibility and possibilities 3. Premises planned according to versatile pedagogical needs | Teacher interviews, classroom observations, principal interview |
E3. Pupils’ involvement in school level activities | 1. No involvement other than the traditional pupil’s role 2. Occasional and emerging activation of pupils 3. Several and various types of pupils’ involvement and responsibilities | Teacher interviews, classroom observations, principal interview |
E4. School-level networking | 1. No networking 2. Some networking, related to specific issues or individual teachers 3. Systematic, established contacts and collaboration partners | Teacher interviews, principal interview |
F. Digital resources | ||
F1. Utility of technical resources | 1. Centralised, insufficient resources, not working properly 2. Resources decentralised but insufficient 3. Good resources, technology decentralised to various spaces, various types of equipment | Teacher interviews, classroom observations, principal interview |
F2. Pupils’ digital competence | 1. Pupils’ digital competence based on informal learning outside school; no plans or activities to support it 2. Pupils’ digital competence supported by a specific course or some individual teachers; not provided for all pupils 3. Digital competence is systematically supported; strategies about teaching digital skills in various subjects and grade levels | Pupil questionnaires, teacher interviews, classroom observations |
F3. Teachers’ digital competence | 1. Digital competence varies, competence improvement based on individual decisions, no common lines, focus on technical skills 2. All teachers have basic competence, focus on technical skills 3. All teachers have multiple types of digital competence, but the level varies; focus on the pedagogical use of technology | Teacher questionnaires, teacher interviews, principal interview, classroom observations |
F4. Pedagogical and technical training and support | 1. Some teachers responsible for technical support, no organised pedagogical support 2. Support organised but not sufficiently; focus on technological support 3. Well-organised support; in technical problems help easily available; also pedagogical support available | Teacher interviews, teacher questionnaires, principal interview |
a The data sources are listed in the order of importance
The profiles demonstrate the differences between the schools: school A has quite advanced practices in all elements; school C is high in school-level practices involving teachers and the principal, but only average in practices directly affecting pupils; and school B is the least developed in all elements, its relatively strongest being leadership and digital resources. We propose that one reason for the differences between the schools is the level of the vision and how well it is shared within the teacher community. Schools A and C had remarkably higher scores than school B in the elements of goals and vision (although even schools A and C could improve on this). These results are in line with previous research, according to which an explicated and shared vision is a key element in school improvement and change (see, e.g. Senge et al. 1994; Antinluoma et al. 2018). At school B, the vision and goals, pedagogical practices with digital technology and school-level knowledge practices were all at a low level, although its digital resources are almost the same as at school C. To benefit from digital technology in improving pedagogy, collaborative visions and efforts focused specifically on it are needed (Laurillard 2008); technology does not change pedagogical practices per se, which describes the situation at school B. At both schools A and C, the elements related to vision, leadership and the teacher community received good or even high scores, but school A was more advanced in pedagogical practices with technology. It seems that deliberate effort is needed to develop high-level pedagogical practices with technology.
Validity of the innovative digital school model
The two aims of the IDI school model, to reveal good practices and points for development as well as to expose differences, were fulfilled, from which we interpret that analytic generalisation (Yin 2014) from the model is possible. With the qualitative data (classroom observations and interviews), we were able to identify new and innovative practices in the school context, developed in the schools for their individual needs. The quantitative data supported the findings based on the qualitative data. Innovative practices were found especially at the school which was evaluated as the most advanced in all elements. One of the schools was the least developed in all the measures investigated, and the third school was in between: it had a strong development culture in general, but the focus of the development work had not been on using digital technology as a vehicle for change. In the latter two schools, digital technology was taken into use by individual teachers, often without integrating pedagogy and technology.
The IDI school model as a framework for investigating differences worked particularly well for those elements which are mainly the responsibility of leadership inside a school (visions of the school, practices of the teaching community and school-level knowledge practices); there were clear differences between the schools in these, especially according to the qualitative data. The three schools differed even though they each follow the same curriculum and the same detailed legislation. The teachers’ educational background is homogeneous, and the schools are located in the same city, which is responsible for providing the resources for all the city’s schools. The role of the city probably explains why there were no statistically significant differences between schools in teachers’ self-estimated digital competence and the use of digital technology in general.
The results of the qualitative and quantitative data were somewhat contradictory regarding the use of digital technology in classrooms. In the teacher surveys, there were no statistically significant differences between schools, but in the pupil surveys there were. Our explanation is that pupils use technology so much in some lessons that it affects their overall experience, and that pupils in the 9th grade use technology more than pupils in lower grades.
Another contradiction concerned the results on pedagogical practices. The teachers participating in the observations and interviews were probably more interested in digital technology, and their practices more advanced, than the practices reported in the survey by the larger group of teachers. As Kivinen et al. (2016) suggested, the technology use of the majority of teachers might represent the use of technology per se, leading to a pragmatic solution in which technology does not support a knowledge-creation approach to learning but is used for practical experiments and learner-centred activities.
The schools examined are located in areas of different socioeconomic backgrounds. The results do not show differences based on this background, which probably indicates the homogeneity of Finnish schools. All schools receive the same resources from the city, and parents do not make financial contributions to the education. The school in the area of lowest socioeconomic status has participated in various projects over the years, and this has promoted the capacity of the teaching staff. The teachers’ development orientation has helped the school to develop advanced practices despite the challenging socioeconomic background of its pupils.
The results of the study demonstrated that mixed methods are needed when investigating the practices of a whole school. Using only the survey data would not have revealed some of the central differences between the schools and would have given quite a narrow view of the situation at each school. Conversely, the qualitative data alone would not have informed us about the extent of the use of digital technology and the competence in using it. Collecting qualitative data requires more resources than using only surveys. However, we found that our data collection model (five teacher interviews and lesson observations, a principal interview, and a survey of teachers and of the highest-grade pupils) was a reasonably inexpensive and valid way to examine the practices of a school.
The IDI school model is an attempt to address the need for practice-oriented methods that help schools and teachers to reflect on their own practices and improve them (Angelides et al. 2004 ), and to narrow the gap between empirical research and practical school work (Wikeley et al. 2005 ), especially related to the change processes of implementing new digital technologies in education.
The IDI school model can be used in schools as a shared conceptual framework for collective reflection, discussion and strategy planning. We have already had some promising experiences of using it in the in-service training of teachers and principals. The model can also be applied to collect best-practice examples from different schools and disseminate them to other schools, or to make school visits and the benchmarking of practices more systematic.
At the municipal and national levels, educational administrators may have an interest in evaluating the status of digital technology use in schools. As our study showed, quantitative data have limitations in describing collaborative pedagogical and working practices. Qualitative methods are important, but there is a need for accessible methods for collecting data widely about the current state of the art in schools. The methods, experiences and results of the present study can serve as a starting point for developing such scalable methods.
As a policy-level implication, we suggest that local and national school administrations focus on schools as knowledge-work organisations when aiming at improvements, such as increasing the quality of pedagogical and knowledge practices with digital technology in schools. We suggest that all elements of the innovative digital school model be considered, and that the starting point should be committing the staff to change: creating shared visions and aims for pedagogical development through digital technology, and supporting school-level practices that include both pupils and teachers.
In the present study, we used data from three schools to examine the applicability and validity of the IDI school model for evaluating the development of schools through digital technology. All three schools were in the same city and had similar municipal resources for digital technology and in-service teacher training, which allowed differences to be revealed, especially in those practices that schools can influence individually in that context. In future research, it would be important to test the model with a larger sample of schools from different contexts (size, location, socioeconomic background, etc.) and from different countries and cultures, thereby further validating the model.
Another interesting line of research would be to follow the development of the same schools longitudinally. Such studies could include interventional aspects: the investigated schools would receive feedback and support from researchers to develop their practices further, and new data would be collected after some period to evaluate the influence of these deliberate development efforts.
In the future, schools will face even more challenges and requirements that the school community will have to meet. The best and most effective schools reflect on their practices and constantly improve their ways of working. We believe that the innovative digital school model offers a tool for schools and for researchers involved in this work.
We are grateful to Jari Lipsanen for his help and guidance with the statistical analyses, to Ian Dobson, who patiently checked the language and provided excellent comments, and to the anonymous reviewers, whose comments helped to improve and clarify the article.
This work was supported by Tekes—the Finnish Funding Agency for Innovation [grant number 40233/09] and by City of Helsinki, Media Center.
Authors’ contributions.
Both authors contributed equally to the study: to its design, data collection and analysis, as well as to writing the article.
The authors declare that they have no competing interests.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Liisa Ilomäki, Email: [email protected] .
Minna Lakkala, Email: [email protected] .
Case-based learning.
Case-based learning (CBL) is an established approach used across disciplines where students apply their knowledge to real-world scenarios, promoting higher levels of cognition (see Bloom’s Taxonomy ). In CBL classrooms, students typically work in groups on case studies, stories involving one or more characters and/or scenarios. The cases present a disciplinary problem or problems for which students devise solutions under the guidance of the instructor. CBL has a strong history of successful implementation in medical, law, and business schools, and is increasingly used within undergraduate education, particularly within pre-professional majors and the sciences (Herreid, 1994). This method involves guided inquiry and is grounded in constructivism whereby students form new meanings by interacting with their knowledge and the environment (Lee, 2012).
There are a number of benefits to using CBL in the classroom. In a review of the literature, Williams (2005) describes how CBL: utilizes collaborative learning, facilitates the integration of learning, develops students’ intrinsic and extrinsic motivation to learn, encourages learner self-reflection and critical reflection, allows for scientific inquiry, integrates knowledge and practice, and supports the development of a variety of learning skills.
CBL has several defining characteristics, including versatility, storytelling power, and efficient self-guided learning. In a systematic analysis of 104 articles in health professions education, CBL was found to be utilized in courses with fewer than 50 to over 1,000 students (Thistlethwaite et al., 2012). In these classrooms, group sizes ranged from 1 to 30, with most consisting of 2 to 15 students. Instructors varied in the proportion of time they implemented CBL in the classroom, ranging from a single case spanning two hours of class time to year-long case-based courses. These findings demonstrate that instructors use CBL in a variety of ways in their classrooms.
The stories that comprise the framework of case studies are also a key component to CBL’s effectiveness. Jonassen and Hernandez-Serrano (2002, p.66) describe how storytelling:
Is a method of negotiating and renegotiating meanings that allows us to enter into other’s realms of meaning through messages they utter in their stories,
Helps us find our place in a culture,
Allows us to explicate and to interpret, and
Facilitates the attainment of vicarious experience by helping us to distinguish the positive models to emulate from the negative models.
Neurochemically, listening to stories can trigger the release of oxytocin, a hormone that increases one’s sensitivity to social cues, resulting in more empathy, generosity, compassion and trustworthiness (Zak, 2013; Kosfeld et al., 2005). The stories within case studies serve as a means by which learners form new understandings through characters and/or scenarios.
CBL is often described in conjunction or in comparison with problem-based learning (PBL). While the lines are often confusingly blurred within the literature, in the most conservative of definitions, the features distinguishing the two approaches include that PBL involves open rather than guided inquiry, is less structured, and the instructor plays a more passive role. In PBL multiple solutions to the problem may exist, but the problem is often initially not well-defined. PBL also has a stronger emphasis on developing self-directed learning. The choice between implementing CBL versus PBL is highly dependent on the goals and context of the instruction. For example, in a comparison of PBL and CBL approaches during a curricular shift at two medical schools, students and faculty preferred CBL to PBL (Srinivasan et al., 2007). Students perceived CBL to be a more efficient process and more clinically applicable. However, in another context, PBL might be the favored approach.
In a review of the effectiveness of CBL in health profession education, Thistlethwaite et al. (2012), found several benefits:
Students enjoyed the method and thought it enhanced their learning,
Instructors liked how CBL engaged students in learning,
CBL seemed to facilitate small group learning, but the authors could not distinguish between whether it was the case itself or the small group learning that occurred as facilitated by the case.
Other studies have also reported on the effectiveness of CBL in achieving learning outcomes (Bonney, 2015; Breslin, 2008; Herreid, 2013; Krain, 2016). These findings suggest that CBL is a vehicle of engagement for instruction, and facilitates an environment whereby students can construct knowledge.
Science – Students are given a scenario to which they apply their basic science knowledge and problem-solving skills to help them solve the case. One example within the biological sciences is two brothers who have a family history of a genetic illness. They each have mutations within a particular sequence in their DNA. Students work through the case and draw conclusions about the biological impacts of these mutations using basic science. Sample cases: You are Not the Mother of Your Children ; Organic Chemistry and Your Cellphone: Organic Light-Emitting Diodes ; A Light on Physics: F-Number and Exposure Time
Medicine – Medical or pre-health students read about a patient presenting with specific symptoms. Students decide which questions are important to ask the patient in their medical history, how long they have experienced such symptoms, etc. The case unfolds and students use clinical reasoning, propose relevant tests, and develop a differential diagnosis and a plan of treatment. Sample cases: The Case of the Crying Baby: Surgical vs. Medical Management ; The Plan: Ethics and Physician Assisted Suicide ; The Haemophilus Vaccine: A Victory for Immunologic Engineering
Public Health – A case study describes a pandemic of a deadly infectious disease. Students work through the case to identify Patient Zero, the person who was the first to spread the disease, and how that individual became infected. Sample cases: The Protective Parent ; The Elusive Tuberculosis Case: The CDC and Andrew Speaker ; Credible Voice: WHO-Beijing and the SARS Crisis
Law – A case study presents a legal dilemma for which students use problem solving to decide the best way to advise and defend a client. Students are presented information that changes during the case. Sample cases: Mortgage Crisis Call (abstract) ; The Case of the Unpaid Interns (abstract) ; Police-Community Dialogue (abstract)
Business – Students work on a case study that presents the history of a business success or failure. They apply business principles learned in the classroom and assess why the venture was successful or not. Sample cases: SELCO-Determining a path forward ; Project Masiluleke: Texting and Testing to Fight HIV/AIDS in South Africa ; Mayo Clinic: Design Thinking in Healthcare
Humanities - Students consider a case that presents a theater facing financial and management difficulties. They apply business and theater principles learned in the classroom to the case, working together to create solutions for the theater. Sample cases: David Geffen School of Drama
Finding and Writing Cases
Consider utilizing or adapting open access cases - The availability of open resources and databases containing cases that instructors can download makes this approach even more accessible in the classroom. Two examples of open databases are the Case Center on Public Leadership and Harvard Kennedy School (HKS) Case Program , which focus on government, leadership and public policy case studies.
Implementing Cases
Take baby steps if new to CBL - While entire courses and curricula may involve case-based learning, instructors who want to implement it on a smaller scale can integrate a single case into their class and increase the number of cases over time as desired.
Use cases in classes that are small, medium or large - Cases can be scaled to any course size. In large classes with stadium seating, students can work with peers nearby, while in small classes with more flexible seating arrangements, teams can move their chairs closer together. CBL can introduce more noise (and energy) into the classroom, to which an instructor often quickly becomes accustomed. Further, students can be asked to work on cases outside of class and wrap up discussion during the next class meeting.
Encourage collaborative work - Cases present an opportunity for students to work together, a practice the literature has long supported as beneficial to student learning (Bruffee, 1993). Allow students to work in groups to answer case questions.
Form diverse teams as feasible - When students work within diverse teams they can be exposed to a variety of perspectives that can help them solve the case. Depending on the context of the course, priorities, and the background information gathered about the students enrolled in the class, instructors may choose to organize student groups to allow for diversity in factors such as current course grades, gender, race/ethnicity, personality, among other items.
Use stable teams as appropriate - If CBL is a large component of the course, a research-supported practice is to keep teams together long enough to go through the stages of group development: forming, storming, norming, performing and adjourning (Tuckman, 1965).
Walk around to guide groups - In CBL instructors serve as facilitators of student learning. Walking around allows the instructor to monitor student progress as well as identify and support any groups that may be struggling. Teaching assistants can also play a valuable role in supporting groups.
Interrupt strategically - Pause group work for whole-class discussion of the case only every so often, especially when students appear confused about key concepts. An effective practice to help students meet case learning goals is to guide them as a whole group when the class is ready. This may include selecting a few student groups to present their answers to discussion questions to the entire class, asking the class a question relevant to the case using polling software, and/or delivering a mini-lesson on an area that appears to be confusing to students.
Assess student learning in multiple ways - Students can be assessed informally by asking groups to report back answers to various case questions. This practice also helps students stay on task, and keeps them accountable. Cases can also be included on exams using related scenarios where students are asked to apply their knowledge.
Barrows HS. (1996). Problem-based learning in medicine and beyond: a brief overview. New Directions for Teaching and Learning, 68, 3-12.
Bonney KM. (2015). Case Study Teaching Method Improves Student Performance and Perceptions of Learning Gains. Journal of Microbiology and Biology Education, 16(1): 21-28.
Breslin M, Buchanan, R. (2008) On the Case Study Method of Research and Teaching in Design. Design Issues, 24(1), 36-40.
Bruffee KS. (1993). Collaborative learning: Higher education, interdependence, and authority of knowledge. Johns Hopkins University Press, Baltimore, MD.
Herreid CF. (2013). Start with a Story: The Case Study Method of Teaching College Science. National Center for Case Study Teaching in Science (NCCSTS); originally published in 2006 by the National Science Teachers Association (NSTA).
Herreid CH. (1994). Case studies in science: A novel method of science education. Journal of Research in Science Teaching, 23(4), 221–229.
Jonassen DH and Hernandez-Serrano J. (2002). Case-based reasoning and instructional design: Using stories to support problem solving. Educational Technology, Research and Development, 50(2), 65-77.
Kosfeld M, Heinrichs M, Zak PJ, Fischbacher U, Fehr E. (2005). Oxytocin increases trust in humans. Nature, 435, 673-676.
Krain M. (2016) Putting the learning in case learning? The effects of case-based approaches on student knowledge, attitudes, and engagement. Journal on Excellence in College Teaching, 27(2), 131-153.
Lee V. (2012). What is Inquiry-Guided Learning? New Directions for Teaching and Learning, 129, 5-14.
Nkhoma M, Sriratanaviriyakul N. (2017). Using case method to enrich students’ learning outcomes. Active Learning in Higher Education, 18(1):37-50.
Srinivasan et al. (2007). Comparing problem-based learning with case-based learning: Effects of a major curricular shift at two institutions. Academic Medicine, 82(1): 74-82.
Thistlethwaite JE et al. (2012). The effectiveness of case-based learning in health professional education. A BEME systematic review: BEME Guide No. 23. Medical Teacher, 34, e421-e444.
Tuckman B. (1965). Development sequence in small groups. Psychological Bulletin, 63(6), 384-99.
Williams B. (2005). Case-based learning - a review of the literature: is there scope for this educational paradigm in prehospital education? Emergency Medicine Journal, 22, 577-581.
Zak, PJ (2013). How Stories Change the Brain. Retrieved from: https://greatergood.berkeley.edu/article/item/how_stories_change_brain