European Journal of STEM Education
Research Article
2019, 4(1), Article No: 09

Student Teachers’ Use of Data Feedback for Improving their Teaching Skills in Science and Technology in Primary Education

Published in Volume 4 Issue 1: 16 Nov 2019


In this study, student teachers explored four data collection methods for data feedback to improve their teaching skills in Science and Technology [S&T]. The aim was to verify whether these methods were suitable for collecting data concerning their teaching skills during S&T activities in their internships. They analysed the collected data and drew conclusions about the quality and possible improvements of their teaching skills. Logbooks and focus group interviews were used to collect data regarding the suitability of the utilised data collection methods. The findings indicate that the questionnaires, interviews, and observations provided suitable data in some cases; however, this strongly depended on how the student teachers applied these data collection methods. The results provide insights into the problems student teachers encounter in collecting, analysing and interpreting data, and how they can be supported therein. Furthermore, it appeared that student teachers need sufficient knowledge and skills to utilise data collection methods; specific training in this field is therefore required.


Many primary school teachers in Europe, Australia and North America find it difficult to teach Science and Technology [S&T] (Appleton, 2006; Gillies and Nichols, 2014). They only teach S&T occasionally, which can result in their pupils being less interested in S&T (Van Aalderen-Smeets, Walma van der Molen, and Asma, 2012). This is partly due to inadequate training during their teacher education (Avery, 2012). Alake-Tuenter (2014) urgently advises primary teacher education institutes to provide sufficient opportunities for their students to develop their pedagogical skills regarding S&T education and to support and mentor them during practice.

Several studies have shown that in-service teachers’ S&T teaching skills improve when they repeatedly experiment with S&T teaching theory in practice and discuss their experiences in follow-up workshops (Gillies and Nichols, 2014; Smith, 2013). These studies also stress the importance of teachers obtaining ownership of their professional development during this learning process.

Teachers acquired this ownership when they had the opportunity to practice new teaching approaches and made decisions based on their own analyses of the problems they encountered in practice (Butler and Schnellert, 2012). To be able to correctly and systematically analyse their practice, teachers needed to build on reliable data that is representative of their teaching. Such data enabled them to make informed decisions and improve their teaching skills.

In this article, systematically collecting data for development of teachers’ pedagogical skills is referred to as data feedback. Data feedback can increase the quality of teachers’ decisions and stimulate continuous improvement (Fullan, 2001; Van Veen, Zwart, and Meirink, 2012). Van den Hurk, Houtveen, Van de Grift, and Cras (2014) tested data feedback with student teachers. Their findings show that student teachers improved their pedagogical skills by using researchers’ feedback based on observations of their instruction quality. As a result, the student teachers were more successful in increasing their pupils’ self-confidence and encouraging them to spend more time on their tasks. However, the data that drove the data feedback process in this study were collected by researchers. Studies in which student teachers themselves are responsible for collecting and analysing data for data feedback are scarce (Schnellert, Butler, and Higginson, 2008).

This study aims to identify which data collection methods generate suitable data for data feedback to aid the development of student teachers’ pedagogical skills in stimulating pupils’ attitude towards S&T. The specific focus on attitude is significant since pupils with a positive attitude towards S&T tended to be more engaged with S&T topics and to make more considered choices regarding them as adults (Osborne, Simon, and Collins, 2003). Pupils with a positive attitude towards S&T not only enjoyed S&T activities, but were also eager to understand S&T subjects and to carry out S&T activities to satisfy their curiosity (Hillman, Zeeman, Tilburg, and List, 2016). Although studies that use data feedback to improve teachers’ S&T teaching skills have been carried out before (Gerard, Spitulnik, and Linn, 2010; Smith, 2013), none of these studies had a specific focus on teaching skills in stimulating pupils’ attitudes towards S&T. In this study, four different data collection methods were designed and tested by student teachers. This was a considerable challenge for them, as they had limited experience in collecting and analysing data and were inexperienced in looking at, and learning from, classroom data. A data collection method is only suitable for data feedback when it enables student teachers to collect data that allows them to improve their teaching skills.

The main question of this study is: which data collection methods generate suitable data for data feedback to aid the development of student teachers’ pedagogical skills in stimulating primary school pupils’ positive attitude towards S&T? The related sub-questions are:

(1) How do student teachers use the data collection methods?

(2) Which strengths and weaknesses do student teachers experience while using the data collection methods?

(3) To what extent does analysing and interpreting the collected data aid student teachers in improving their pedagogical skills?

This study creates insight into how student teachers can develop their pedagogical skills to stimulate their pupils’ attitude in a systematic manner. The results may contribute to providing student teachers with tools to allow them to assess their own development and to continue their professional development independently after completing teacher education.


Pedagogical Skills

Our modern society is increasingly influenced by S&T knowledge. As such, it is important that future generations understand how they can use and build on this knowledge. Children [aged 10-11] with a positive attitude towards S&T seem to show more interest in S&T and more often select an S&T-related profession as adults (Osborne, Simon, and Collins, 2003). Hence, primary teachers need to be aware of the fact that stimulating their pupils’ attitude towards S&T needs explicit attention (Van Aalderen-Smeets et al., 2012). To this end, teacher education institutes need to prepare student teachers for this task by ensuring that they (1) master S&T knowledge, (2) have a positive attitude towards S&T and S&T teaching, and (3) possess sufficient pedagogical skills for S&T education. Alake-Tuenter (2014) showed, however, that Dutch teacher education institutes insufficiently meet these three objectives. Our study focuses on the third objective, the development of student teachers’ pedagogical skills.

Several studies have shown that pupils’ attitude towards S&T becomes more positive when S&T is taught according to the principles of Inquiry Based Science Education [IBSE] (Murphy, Murphy, and Kilfeather, 2011). IBSE is an approach to teaching and learning in which the key principle is that pupils develop their ideas about science and engineering topics by means of experimentation and interpretation of inquiry driven results. It is also important that pupils discuss their findings with each other and reflect on their inquiry process (Driver, Asoko, and Leach, 1994).

Often, IBSE activities are structured using the 5E Instruction Model, which is an internationally recognized instruction model comprehensively described by Bybee et al. (2006). This model includes the following phases: engage, explore, explain, elaborate, and evaluate. For example: in a 5E structured lesson on electrical circuits, the teacher takes apart a battery-operated bedside lamp and asks questions about the materials encountered during that process in the engage phase. In the exploration phase, pupils explore how the battery, wires and a bulb work and they try to light the bulb in small groups. In the explanation phase, the pupils discuss what they discovered during the exploration phase with the teacher. In the elaboration phase, pupils can investigate one of the questions that emerged during the explanation phase by conducting an experiment on electricity or by designing a product in which an electrical circuit is built. In the evaluation phase, the pupils present and discuss their findings and/or products.

To enable student teachers to stimulate pupils’ attitude towards S&T through IBSE and the 5E Instruction Model, they need to acquire specific pedagogical skills. This study distinguishes two types of pedagogical skills, each having its own characteristics: pedagogical skills to support pupils’ cognitive needs and pedagogical skills to support pupils’ social needs (Driver et al., 1994; Hodgson and Pyle, 2010). Cognitive pedagogical skills are employed to foster pupils’ S&T understanding during the 5E Instruction Model phases. Examples of these pedagogical skills are stimulating pupils’ thinking by asking questions, and letting them experiment, observe, and reason (Bybee et al., 2006). The cognitive pedagogical skills are intended to stimulate a positive attitude towards S&T by encouraging pupils’ interest in S&T subject matter and activities. Social pedagogical skills are employed to foster pupil wellbeing and enable pupils to express themselves during the phases of the 5E model (Driver et al., 1994). Although social pedagogical skills have a broader application beyond IBSE, this study addresses the social pedagogical skills within the context of the 5E model. Examples of these pedagogical skills are stimulating self-confidence to do ‘hands-on’ activities and involving all pupils in discussions (Hodgson and Pyle, 2010). The social pedagogical skills are intended to stimulate a positive attitude towards S&T by encouraging pupils’ confidence to address the S&T subject matter and activities according to their own preferences, and thus acquire affinity for S&T.

Table 1 shows which pedagogical skills are required for each phase of the 5E Instruction Model. The skills mentioned in Table 1 are derived and adapted from teacher instructions on how to apply the 5E Instruction Model listed by Bybee et al. (2006: 34) and from Goldston, Dantzler, Day, and Webb (2013), who developed an instrument to measure the quality of 5E modelled lesson plans.


Table 1. Pedagogical skills required for stimulating pupils’ attitude towards S&T, per phase of the 5E instruction model

For each phase of the 5E instruction model, the cognitive and social pedagogical skills are listed together.

Engage
- collecting materials linked to pupils’ everyday context
- eliciting pupils’ prior knowledge (related to the lesson objectives)
- raising pupils’ interest/motivation to learn
- creating a social atmosphere in which pupils can express themselves freely

Explore
- providing suitable and challenging materials
- presenting instructions clearly
- asking questions that evoke pupils’ ideas and stimulate pupils to explore materials
- encouraging pupils to bring questions forward
- stimulating self-confidence to do ‘hands-on’ activities

Explain
- asking questions that lead to development of concepts and skills (drawing upon the Explore activities or data collected during the Explore activities)
- leading an interactive discussion driven by divergent and convergent questions
- involving all pupils into the discussion

Elaborate
- providing sufficient and appropriate materials to enable pupils to conduct their experiment/design
- enabling pupils to test their concepts by means of their experiments or designs
- coaching and stimulating small-group discussions about the experiment/design
- making a deliberate group distribution considering differences between pupils
- facilitating shared ownership within the small groups
- facilitating pupils’ collaboration

Evaluate
- determining beforehand which kind of presentation is most suitable
- enabling pupils to evaluate their own experiment/design
- making pupils feel proud of their experiment/design

During all phases
- making connections with everyday contexts
- motivating pupils for S&T subjects
- creating situations in which justice is done to pupils’ diversity


Data Feedback

In this study, data feedback was used to aid the development of student teachers’ cognitive and social pedagogical skills in stimulating pupils’ positive attitude towards S&T. Data feedback usually aims to improve teachers’ instructional skills in order to foster pupils’ cognitive development, and is already common practice in many schools in Europe and North America (Carlson, Borman, and Robinson, 2011; Van Geel, Keuning, Visscher, and Fox, 2016). The data used for data feedback in those schools are, for example, assessment results and student questionnaire data (e.g. pupils’ perceptions of the quality of their teachers’ instruction; Schildkamp and Kuiper, 2010). To structure data feedback for the improvement of teachers’ instructional skills, an instructional change cycle was developed (Schnellert et al., 2008). This cycle contained the following steps: setting instructional goals, designing new instructional strategies, conducting new instructional practices, collecting and analysing data, and deciding which further actions to take. The instructional change cycle is intended to be carried out in small groups of collaborating teachers who run through a process of self-regulated learning by thoroughly examining their own classroom data, without interference by researchers. Although it is known that teachers’ pedagogical skills improve when using this method (Butler and Schnellert, 2012), it is not common practice in primary teacher education institutes.

As our study focused on student teachers’ skills related to stimulating pupils’ attitude, the above-mentioned instructional change cycle was adapted: the first and second steps of the instructional change cycle were merged to stress that the pedagogical goal and the designed IBSE activity form a single unit. Another change was the addition of a new step, namely designing a data collection method [step 2]. This new step was necessary because the student teachers in this study selected a data collection method and adapted or redesigned it so as to collect data reflecting the development of the specific pedagogical skills aimed at stimulating pupils’ attitude towards S&T. The step of ‘collecting data’ was moved from the fourth step [instructional change cycle] to the third step, as data collection occurred either during or immediately after the IBSE activity. Consequently, the fourth step consisted of analysing the collected data. Finally, the fifth step consisted of concluding which further steps to take and reflecting on the development of the student teachers’ pedagogical skills during the completion of the data feedback cycle (evaluation). The data feedback cycle as displayed in Figure 1 was used in this study.


Figure 1. Data feedback cycle for development of IBSE teaching skills



Context and Participants

The study was carried out at an institute for primary teacher education in the Netherlands. The participants (n=19) were student teachers in the penultimate year of their bachelor’s study who attended a semester programme in which IBSE was the main theme. All participants were between 19 and 23 years old and had more than two years of internship teaching experience.

The participating student teachers completed two data feedback cycles (Figure 1), each containing a practical part conducted during their internship at primary schools. These data feedback cycles were the central activities during the semester programme. The participants were trained and directed in using the data feedback cycles by the teacher educator. Aside from the data feedback cycles, the semester programme also consisted of several theoretical and applied lectures on IBSE, an excursion to a science museum, and excursions to primary schools where IBSE activities were daily practice.

Data were collected on two different levels in this study. Firstly, the participating student teachers utilised tools to collect data for data feedback to develop their pedagogical skills in stimulating pupils’ positive attitude towards S&T. These tools will be termed data collection methods from this point onwards. Secondly, the researcher collected data in order to analyse whether, and how, the data collection methods utilised by the student teachers generated suitable data for feedback to aid the development of their pedagogical skills in stimulating pupils’ attitudes towards S&T. The tools used by the researcher will be called instruments.

Design and Procedure

The participating student teachers completed the data feedback cycle (Figure 1) twice over a period of 14 weeks. We opted for two data feedback cycles for two reasons. Firstly, to allow participants to practice the data feedback process during the first cycle and apply improvements in the second cycle. Secondly, to identify participants’ ability to apply the pedagogical skills learned in the first cycle, in the second cycle. Each data feedback cycle lasted seven weeks (Table 3).

A data feedback cycle consisted of five steps (Figure 1):

  1. The participants started by determining which pedagogical skills (Table 1) they had not yet fully mastered. Following this, they selected one or two skills and designed an IBSE activity that would provide them the opportunity to develop these skills. The participants were allowed to choose their own pedagogical skills in order to give them ownership of their development; the skills chosen by each of the participants are listed in Table 2. The pedagogical skills displayed in Table 1 indicate which skills are most characteristic of each 5E model phase. The skills are deliberately non-specific to provide (student) teachers the opportunity to individualise these skills in line with their personal development.

  2. The participants were introduced to four types of data collection methods selected by the researcher: classroom observations, group interviews, pupils’ drawings, and questionnaires. These four methods were presented in two lectures outlining how to use them for collecting valid data. These lectures pertained to classroom research in general and to collecting data about participants’ own pedagogical skills. Following this, each participant chose two of the four data collection methods and (re)designed them to better fit the pedagogical skills they strived to master. Once again, the participants were allowed to choose (this time the data collection methods) to strengthen their ownership; the data collection methods that each of the participants chose are displayed in Table 2. To support participants in (re)designing the chosen methods, information from two Dutch textbooks on research methods for doing educational research (Baarda, 2010; Onstenk, Kallenberg, Koster, and Scheepsma, 2011) was available. In addition, participants were encouraged to consult articles or books specifically dealing with pupils’ attitude towards S&T in which sample observation tools (Laevers and Peeters, 1994), interview questions (Fitzgerald, Dawson, and Hackling, 2013), children’s drawings (Murphy, Varley, and Veale, 2012), or questionnaire items were listed. The second step ended by discussing the IBSE activities designed in step 1 and the redesigned data collection methods within the feedback groups. During these meetings, the teacher educator and peers gave specific feedback on the designed data collection methods to enable participants to collect valid data.

  3. The participants conducted the designed IBSE activity in practice and collected data using their redesigned data collection methods.

  4. The participants analysed and interpreted the collected data in order to find out whether and how they had succeeded in improving the pedagogical skills they selected. The collected data, the analyses, and the interpretations were then discussed within the feedback groups. Once again, for each feedback group there was a teacher educator present for consultation, this time focusing on analysing and interpreting the collected data.

  5. The participants completed the data feedback cycle by evaluating the data feedback process and by drawing conclusions concerning their skills’ development. Following this, all participants repeated the data feedback cycle.

In order to create a diverse and thorough impression of how a data collection method could be utilised, each participant piloted two different types of data collection methods and used those methods in both data feedback cycles.


Table 2. The data collection methods and the pedagogical skills that student teachers chose (one row per student teacher)

Chosen data collection methods | Chosen pedagogical skills*
observation, interview | 2.2, 4.6
observation, drawing |
observation, drawing | 4.2, 6.2
observation, questionnaire | 2.4, 4.2
observation, questionnaire | 3.2, 3.3
observation, questionnaire | 2.4, 4.2
interview, drawing | 4.2, 6.2
interview, drawing | 4.2, 6.2
interview, drawing | 3.2, 6.2
interview, questionnaire | 4.4, 6.2
interview, questionnaire | 1.2, 4.2
interview, questionnaire | 4.4, 5.3
interview, questionnaire | 2.5, 3.2
interview, questionnaire | 4.4, 4.5
interview, questionnaire | 2.3, 4.5
interview, questionnaire |
interview, questionnaire | 2.5, 4.5
interview, questionnaire | 2.5, 4.6
drawing, questionnaire | 4.1, 6.2

* The numbers in this column correspond to the pedagogical skills in Table 1 [for example, 4.2: 4 represents the phase of the 5E model and 2 represents the number of the pedagogical skill within that phase]


Table 3. Time frame of participants’ activities during the data feedback cycles

Week | Participants’ activity (data feedback step in brackets)
1 & 8 | selecting pedagogical skill(s) for data feedback (step 1); designing IBSE activity (step 1); forming feedback groups of 5 participants each (only in the 1st week)
2 & 9 | discussing designed IBSE activity in feedback group (step 1); selecting and redesigning data collection method (step 2)
3 & 10 | discussing redesigned data collection method in feedback group (step 2)
4 & 11 | conducting IBSE activity and collecting data (step 3)
5 & 12 | analysing and interpreting collected data (step 4)
6 & 13 | discussing analyses and interpretations in feedback group (step 5)
7 & 14 | reflecting on the data feedback process and concluding whether, and if so how, their pedagogical skills can be improved (step 5)




Two instruments, logbooks and focus group interviews, were used to collect data reflecting whether and how the data collection methods participants used generated suitable data for data feedback.


Logbooks

The participants reported in an individual logbook on how they had performed and experienced each step of the data feedback cycle. The logbook was intended to reveal how they collected, analysed, and interpreted data, and to provide insight into how the participants experienced the particular data collection methods they used (Ketelaar, Koopman, Den Brok, Beijaard, and Boshuizen, 2014).

During or after conducting a data feedback cycle step the participants addressed the following questions in their logbooks:

Step 1: What pedagogical skill(s) do you want to improve? Which S&T-activity are you going to conduct?

Step 2: Which data collection methods are you going to utilise? What data do you intend to collect?

Step 3: Which data were collected and what were your experiences during data collection? What strengths and weaknesses did you encounter in collecting your data?

Step 4: How did you analyse and interpret the collected data? How did your feedback group react to your analyses and interpretations?

Step 5: Did the reflection on the collected data change your view on your own pedagogical skill(s), and if so how? In hindsight, how did you experience working with the utilised data collection methods?

Once the first feedback cycle had been completed, the participants received feedback from the teacher educators on how they utilised the data collection method based on what they reported in their logbook. During the second data feedback cycle, participants answered the same questions in their logbook.

Focus group interviews

After the participants had finished their logbooks for the second data feedback cycle, four focus group interviews were conducted (one per data collection method). The aim of these interviews was to gain in-depth insight into how the participants collected, analysed, and interpreted the data for developing their pedagogical skills (Krueger and Casey, 2015).

The participants’ experiences were examined by interviewing groups of six randomly selected participants who utilised the same data collection method. Six participants per interview were selected because this would allow different experiences and opinions to emerge while remaining manageable enough to ensure that all participants had ample opportunity to share their experiences. In each focus group interview the following main questions were addressed: (1) how were data collected, (2) how were data analysed, and (3) did the collected data generate suitable information that could be utilised to improve participants’ pedagogical skills for IBSE? In the interviews, the participants’ experiences and underlying motives for making decisions concerning the application of the method were examined. Once a main question was posed, the participants had the opportunity to answer it and to react to each other until no new information emerged. The researcher, who acted as the interviewer, kept the discussions within the boundaries of the subject and encouraged the discussion without leading the participants to specific opinions (Hilby, Stripling, and Stephens, 2014). Each interview lasted about 60 minutes and was audio-taped and transcribed verbatim.

Data Analysis

The data gathered by means of the logbooks and the focus group interviews were analysed in two phases. In the first phase, all data were thoroughly read and divided into fragments. The resulting fragments were then labelled using ATLAS.ti 7.5; a total of 234 fragments were labelled. Some labels indicated which data collection method a fragment pertained to. Some of the fragments addressed a weakness or strength of a data collection method; in order to gain insight into the suitability of a method, these fragments were labelled as such. An example fragment from a focus group interview is the following phrase: “I asked the pupils to write down what their drawing represented; if I had only looked at the drawings at home, I would not have known what they represented”. This fragment, which pertains to the data collection method drawings, indicates how the participant deployed this method and that the participant was unable to interpret the drawings without a written explanation. Thus, this fragment was labelled as ‘drawings’ and as ‘weakness’. In addition to labels regarding methods and strengths and weaknesses, labels were assigned to fragments indicating participants’ actions to improve the pedagogical skills they chose and their reflections on those actions. The following fragment from one of the logbooks illustrates how this part of the labelling was performed: “On the recording I saw that, during the discussion, the pupils were distracted by all the materials in front of them. Next time I will hand out the materials after I have finished the discussion to allow the pupils to have a better focus on it”. This fragment was labelled as ‘observation’ [given that this was the data collection method used] and as ‘involving all pupils into the discussion’ [5E model skill 3.2, given that this was the pedagogical skill the participant aimed at].

During the second phase, the labelled fragments were placed in a matrix. Per data collection method, labelled fragments were linked to the following topics: (1) how the data collection method was performed by the participants, (2) which strengths and weaknesses of the data collection methods the participants experienced, and (3) whether the collected data reflected how participants mastered the pedagogical skills they aimed at, and to what extent analysing and interpreting the collected data aided participants in improving these skills.

An audit procedure was conducted by another researcher to check the transparency and accuracy of the labelling process, as well as the justifiability and acceptability of the analyses. This audit was carried out according to the stages presented by Akkerman, Admiraal, Brekelmans, and Oost (2008). The auditor considered the assignment of the labels to the fragments to be justifiable. The manner in which the data were analysed and arranged in the matrix was found to be transparent and consistent with the raw data. The audit report can be retrieved by contacting the first author.


In this section, the results are presented per data collection method.

Data Collection Method: Observation

The six participants who chose the data collection method observation utilised two different approaches to observing their pedagogical skills: (1) with the aid of an observation instrument called the Leuven involvement scale (used by 2 participants) and (2) with the aid of (partly) self-designed observation forms (used by 4 participants). The results of these two approaches are displayed separately below, as they differ in methodology and findings. Each of these observation methods generated data pertaining to a specific pedagogical skill.

Performance of the data collection method - Leuven involvement scale

The Leuven involvement scale (LBS-L; Laevers and Peeters, 1994) is a widely used classroom observation instrument enabling observers to determine the extent of pupils’ (7-12 years old) involvement in the classroom; the LBS-L was one of the sample observation tools in the semester programme. With the LBS-L, the perceived behaviour of pupils can be graded on a scale of 1-5 (1 = low involvement, 5 = high involvement), and for scores 4 and 5 it can be ascertained whether pupils are sham involved (i.e. pupils appear involved while showing signs such as wobbling on the chair or biting nails) or truly involved. Sham involvement is indicated by an apostrophe [ ‘ ] after the score. The participants who collected their data with the aid of the LBS-L video-recorded their IBSE activity. Based on this recording, they determined their pupils’ involvement using the LBS-L score list. Following this, they watched the recordings again, this time to understand the relation between their pedagogical skills and the perceived pupil behaviour.

Strengths and weaknesses - Leuven involvement scale

The participants indicated that LBS-L is only suitable for collecting data regarding the pedagogical skill ‘involving all pupils into the educational discussion’ [3.2]. Furthermore, they mentioned that LBS-L requires rather detailed footage, which they considered a weakness, as video recording classroom situations was not permitted in all schools.

Relation between data and pedagogical skills - Leuven involvement scale

Both participants were able to collect data which reflected how they mastered the pedagogical skill ‘involving all pupils into the discussion’. In her logbook, Miranda reported having monitored a pupil involvement of 3 [LBS-L scale, moderate score] and added: “The pupils looked around and were distracted by all the materials in front of them during the educational discussion”. Miranda concluded that it would be better to remove the materials before starting the educational discussion. Erin expressed doubts regarding the reliability of LBS-L in some cases: “one of the pupils has ADHD and therefore he wobbled a lot, this had to be marked as sham involvement while he was really involved.”

The participants, however, did not find that LBS-L aided them in improving their pedagogical skills. Miranda noted that despite the realistic impression of pupils’ involvement provided by the LBS-L, it remained unclear how to improve this. This was confirmed by Erin in her logbook after she monitored a pupil involvement of 3 [LBS-L scale] during the educational discussion: “I still have no idea how I can improve my pupils’ involvement.” To improve their pedagogical skills adequately, participants needed to know the motives behind pupils’ perceived behaviour. Miranda indicated that pupils’ motives were more visible on video during the Elaborate phase, because pupils express themselves more during this phase.

Performance of the data collection method - [partly] Self-designed observation form

Participants who were not allowed to video-record their pupils because of the primary school’s privacy rules used an observation form that was completed by their supervising in-service teacher [mentor] while the participant conducted the IBSE activity. Three of those participants designed their own observation form. They formulated questions that focused on pupils’ behaviour, such as: is the pupil actively involved in the activity, and how do pupils collaborate? The mentor could score these questions on a scale of 1-5 [1 = low involvement/little collaboration, 5 = high involvement/high degree of collaboration]. The fourth participant, Margaret, utilised an existing [and tested] observation form during her first IBSE activity (SLO, 2017), on which her mentor noted observations regarding her activities during each phase of the 5E Model. However, the data collected with the guidelines on that observation form did not reflect whether Margaret mastered her pedagogical skills, since the form focused on her activities and not on the pupils’ responses to them. It contained questions such as: how does the teacher deal with individual differences between pupils, and how does the teacher challenge pupils? These questions prompted Margaret’s mentor to describe only her actions and not her pupils’ reactions, even though Margaret also needed such data. In the second data feedback cycle she adapted the observation form to gather information on the pupils’ reactions and instructed her mentor to focus primarily on the impact of her actions on the pupils’ behaviour.

Strengths and weaknesses - [partly] Self-designed observation form

The only strength mentioned by the participants was that the observation forms created opportunities for the mentors to encourage student teachers and to provide practical advice. Margaret noted: “My mentor’s remarks on the observation form encouraged me to continue.” The participants mentioned two weaknesses. Firstly, the mentor could only monitor a few pupils simultaneously, which limited the amount of data obtained. Secondly, the quality of the feedback depended on the mentor’s expertise. The latter was expressed by participant Cynthia: “I depend on what the mentor wrote down. If I had had footage, I could have checked whether it was true and what the mentor intended.”

Relation between data and pedagogical skills - [partly] Self-designed observation form

The participants who developed their own observation form experienced difficulties in formulating clear observable actions which could be used to improve their skills. Margaret, however, demonstrated how the collected data aided her in improving the pedagogical skill ‘enabling pupils to test their concepts by conducting their own experiment or design’ [4.2]. During the IBSE activity, her mentor had recorded Margaret’s actions as well as her pupils’ responses to those actions on the observation form she had redesigned. This is illustrated by Margaret in her logbook:

My mentor wrote on the observation form: “A group of pupils wanted to test their designed self-running toy car in the test zone. It did not run well at all. Margaret observed the situation quietly and when the children looked at her she said: ‘Maybe weight has something to do with it...’ The children look at each other and said: ‘Of course.... we made it far too heavy...’ and went on redesigning.” This was what I wanted to achieve. To enable pupils to test their concepts during their own design process by coaching.

Through the confirmation of having mastered the chosen pedagogical skill, Margaret knew she could strive to develop other pedagogical skills she did not yet master. In conclusion, observations [with the aid of LBS-L or other observation forms] are suitable for data feedback provided that the student teachers’ actions and pupils’ reactions are both monitored. However, this data collection method provided only limited aid in improving the pedagogical skills because the underlying motives of pupils’ behaviour remain undetected.

Data Collection Method: Interview

Performance of the data collection method

The data collection method interview was performed by 13 participants, who all conducted semi-structured group interviews with groups of two to four pupils after the IBSE activity. The interview questions were prepared in advance and were aimed at specific pedagogical skills selected by the participant. For example, participant Jack intended to collect data regarding the pedagogical skill ‘stimulating self-confidence to do hands-on activities’ [2.5]. Thus, Jack prepared the question “did you think it was difficult to build the plane yourself?” to verify how confident the pupils were during the hands-on activity. The interviews were recorded, transcribed and analysed by the participants.

Strengths and weaknesses

The strength of the interview method appeared to be its suitability for collecting data to aid the improvement of the participants’ pedagogical skills, since these data directly show how the pupils experienced their actions. This was illustrated by participant Isabelle in her logbook in the following interview fragment with two pupils, Luke and Michael: “Isabelle: ‘Can you tell me, what did you think of the lesson?’ Michael: ‘I liked it, but..’ Isabelle: ‘Yes..’ Michael: ‘How shall I put it? We didn’t have enough time, we had to build too fast.’ Luke: ‘..and we had a lot more to make.’” At the same time, participants experienced the following weaknesses: [a] finding a suitable time and place to conduct the interviews was sometimes difficult, [b] pupils sometimes felt uncomfortable being interviewed, and [c] the participants struggled to formulate adequate follow-up questions. Claudia explained during the focus group interview: “I expected the pupils to give better answers”, and she did not know whether this was due to her questions or because the pupils felt uncomfortable with the situation. Barry remarked that an interview quickly becomes a formal affair, resulting in pupils giving short factual answers.

Relation between data and pedagogical skills

Four participants succeeded in collecting data reflecting how they mastered the chosen pedagogical skill; all of them collected data on how they mastered the pedagogical skill ‘enabling pupils to test their concepts by conducting their own experiment or design’ [4.2]. Participants Tobias and Isabelle illustrated in their logbooks how the data they collected aided them to improve this pedagogical skill.

After the first IBSE activity, Tobias noted:

When I asked the pupils what they thought of the IBSE activity they reacted as follows: ‘We are not used to thinking about the first step ourselves. The instruction was only: make something that moves with cogs, the rest we had to come up with by ourselves. This was too difficult for us.’

In response, Tobias showed an open and accessible attitude by saying: “Good of you to tell me this” and then asked the in-depth question: “what would have helped you?” His pupils then explained that they needed clear instructions during the elaborate phase. Tobias concluded he mastered the chosen pedagogical skill insufficiently, since he had provided insufficient support to enable his pupils to test their concepts. For this reason, he sought to provide this support during his follow-up IBSE activity. In the interview that followed, his pupils reported: “the goal and instructions were clear. You explained what the point was. This made trying it out more fun. We wanted to really try it out ourselves.” Tobias concluded he had to consider carefully in advance how much self-regulation would stimulate the pupils during the elaborate phase.

Isabelle described that she had her pupils design a toy car that had to ride down a slope on completion. The point was to let it drive as far as possible from the slope. After interviewing her pupils, Isabelle noted:

At first the pupils were not reacting positively; they clearly indicated that they did not have enough time to test and adjust their designed car. As the interview progressed, the pupils became increasingly positive and I found out why the interview had started negatively. The activity was fully in line with their interest and therefore they wanted more time to adjust their design.

Isabelle had asked her pupils the in-depth question: “you needed more time to improve your car, but what did you want to do?” In response, her pupils told her in detail how they wanted to improve their car. This response aided Isabelle in understanding that her pupils would not only have been more motivated but would also have learned more if they had been given more time.

Most participants [9], however, failed to collect data reflecting how they mastered their chosen pedagogical skill as they did not ask in-depth questions.

In conclusion, student teachers’ ability to collect data specific enough to indicate how they mastered their targeted pedagogical skill was largely dependent on their open and accessible attitude during the interviews and their persistence in asking in-depth questions in order to find helpful clues. It seemed essential to encourage pupils to express themselves and to regularly summarize what pupils said in order to avoid miscommunication. In contrast to the other students, Isabelle and Tobias were able to do so and considered their interview data highly suitable.

Data Collection Method: Drawings

Performance of the data collection method

All six participants who performed the data collection method drawings chose to improve the pedagogical skill ‘motivating pupils for S&T subjects’ [6.2]. They asked their pupils to make pre- and post-lesson drawings of themselves during an IBSE activity, showing how motivated they were to conduct IBSE activities. Next, the participants compared the pre- and post-lesson drawings and drew conclusions. Four of the participants asked their pupils to write down what they had drawn, anticipating that some of the drawings might not be easy to interpret. This was illustrated in the focus group interview by participant Ellie:

I asked the pupils to write down what their drawing represented; if I had only looked at the drawings at home, I would not have known what they represented.

Ellie had her pupils draw their faces, showing their emotion regarding the activity that caused it (Figure 2). The other participants had not explicitly instructed their pupils to draw their emotion in a separate drawing. However, they indicated that the emotion in their pupils’ drawings was often clearly visible.

Strengths and weaknesses

The strengths of the method drawings appeared to be that pupils enjoyed making drawings, that it required little preparation on the part of the participants, and that many of the drawings provided a good impression of the pupils’ motivation for the IBSE activity. The latter was illustrated by Ellie, who had given her pupils a basin partly filled with water and challenged them to build a dike that kept all the water in one half of the basin. Ellie discussed this in the focus group interview, stating:

Things really did not go well at some points and the dikes collapsed quite often. Still, the drawings showed me that they were having fun during this activity [as in Figure 2, many of her pupils drew drawings depicting happy faces]. Then, I think, I have done a good job.

On the other hand, the main weakness mentioned by the participants was that the drawings were often unclear and not specific enough for them to know how to improve their pedagogical skills.


Figure 2. One pupil’s drawing showing emotion during the IBSE activity [Note: The statement above the drawing can be translated as “very much fun”]


Relation between data and pedagogical skills

The drawings reflected how the pupils had experienced the IBSE activity emotionally, which gave the participants a rough understanding of how they mastered the pedagogical skills. However, these data were insufficiently specific to aid in improving their skills. In the focus group interview, participant Laura reported the following about her pupils’ drawings: “they drew exactly what they had done in class, I had no information regarding my actions.” On the other hand, Lucy mentioned that she had succeeded in understanding how the pupils had experienced the IBSE activity emotionally, which allowed her to know whether she had succeeded in her IBSE activity.

In conclusion, the participants indicated that pupils’ drawings of themselves during the IBSE activity provided only basic information regarding pupils’ attitudes; they were unable to improve specific pedagogical skills based on these drawings.

Data Collection Method: Questionnaire

Performance of the data collection method

The data collection method questionnaire was used by 13 participants. Since the participants discussed their questionnaire designs in the feedback groups, they used each other’s questions whenever possible. The final questionnaires differed, as each participant intended to collect data regarding a different pedagogical skill. The length of the questionnaires varied from 8 to 15 questions. The question types also varied across the questionnaires: some participants used items with a four- or five-point Likert scale, while others selected multiple-choice questions, ranking questions, or a mix of these types. The questions were thus mainly closed questions. Participants administered the questionnaires to their pupils before and after the IBSE activity. Afterwards, they plotted the collected data in graphs and compared the pre- and post-lesson questionnaire scores.
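The pre- and post-lesson comparison described above can be sketched in a few lines of code. The sketch below is illustrative only: the item counts and response values are hypothetical, and the 1-5 Likert coding is just one of the formats the participants used.

```python
# Illustrative sketch only: comparing mean Likert scores per questionnaire
# item before and after an IBSE activity, as the participants did with
# graphs. All numbers here are hypothetical.

def mean_per_item(responses: list[list[int]]) -> list[float]:
    """Mean Likert score per item, given a pupils-by-items response table."""
    n_pupils = len(responses)
    return [sum(pupil[i] for pupil in responses) / n_pupils
            for i in range(len(responses[0]))]

pre = [[4, 3], [5, 4], [3, 3]]    # three pupils, two items, before the lesson
post = [[5, 4], [5, 5], [4, 3]]   # the same pupils after the IBSE activity

for item, (before, after) in enumerate(zip(mean_per_item(pre),
                                           mean_per_item(post)), start=1):
    print(f"item {item}: {before:.2f} -> {after:.2f} (shift {after - before:+.2f})")
```

A per-item shift like this mirrors the comparison Tobias made when fewer pupils responded positively after his first activity than before it; as the participants noted, such a shift signals whether a skill needs improvement but not how to improve it.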

Strengths and weaknesses

In general, the participants regarded the data collection method questionnaire positively. The reported strengths were that the data could be collected easily and quickly, and that analysing the data and drawing conclusions was relatively convenient. Jack expressed this in his logbook as follows: “With this method, I had all the data swiftly collected and organized.” Besides these strengths, the participants indicated the following weaknesses: pupils did not fill out the questionnaires carefully when (a) a high number of questionnaires had to be filled out within a given time frame, (b) the questionnaires contained too many questions, or (c) the questionnaires were administered at the end of the day.

Relation between data and pedagogical skills

In their logbooks, most participants indicated that the collected data reflected how they mastered the pedagogical skill ‘motivating pupils for S&T subjects’ [6.2]. To that end, they formulated questions such as “would you like to do IBSE activities more often?” and “would you like to learn more about S&T topics?”. In Tobias’ case, 15 out of 20 pupils responded positively to the first question prior to his first IBSE activity; after the activity, only 10 of his pupils responded positively to the same question. Thus, Tobias concluded that he had to improve his pedagogical skills.

While six participants concluded they mastered their chosen pedagogical skills insufficiently, none of them was able to identify how to improve their skills. Therefore, in the focus group interview, the participants agreed that it is necessary to collect additional data with other methods. Alice made the following point:

In the questionnaire a pupil can indicate ‘I enjoyed the lesson’ but I had no idea why he indicated that, or which of my actions was responsible for that. But you can use the questionnaire during your interview and ask pupils for clarification.

Yet, Tobias noted in his logbook:

I assumed that my pupils would appreciate it when I provided them with ample opportunity for self-regulation [in the first IBSE activity]. Anyway, this did not work. Results from the questionnaire [see above] showed that the pupils did not appreciate this. From the interviews, I learned that the pupils benefit more from structured teaching. Within that structure, they want to have opportunities for self-regulation [see the section Data Collection Method: Interview].

In conclusion, the participants agreed that the data from questionnaires and interviews seemed to complement each other. The questionnaires provided participants with a general idea of the extent to which they mastered the pedagogical skills, and the interviews allowed them to acquire data that enabled them to improve their skills.


Main Findings

Regarding the first research question pertaining to how student teachers used the data collection methods, the findings indicate that participants were insufficiently prepared for designing and using data collection methods since they had significant difficulties in this respect. For example, the participants struggled to design suitable observation forms and to ask follow-up interview questions to obtain useful detailed information. Only a few participants used data collection methods in such a way that suitable data were collected.

The second research question concerns the strengths and weaknesses participants experienced while using the data collection methods. The participants experienced both. The main strengths they encountered were: a) observation through video footage of classroom situations proved to be a suitable method for determining pupil involvement, b) observation forms filled in by a mentor sometimes strengthened student teachers’ development, c) interviews provided specific pupil statements which helped the participants understand how pupils experienced their actions, and d) the questionnaires gave the participants an easy overview of how their pupils had experienced the IBSE activity. The main weaknesses that emerged were: a) in some schools, video recording classroom situations was not permitted, b) finding the right time and place to interview pupils was sometimes difficult, and c) questionnaires were sometimes poorly completed due to bad timing or containing too many questions. The drawings provided a general impression of pupils’ attitude; however, the participants were unable to find clues for improving their pedagogical skills in such data. Hence, drawings do not seem suitable for data feedback.

The third research question pertains to the extent to which analysing and interpreting the collected data aided student teachers in improving their pedagogical skills. Due to the participants’ lack of research skills, however, this study yielded little suitable data on this question. In some cases, the participants did collect data that helped them understand how to improve their skills. Participants who used interviews effectively were characterised by an open and accessible attitude towards their pupils during the interviews and by persistence in asking in-depth questions to unearth helpful clues. Participants who met those two conditions collected suitable data, which enabled them to effectively improve their pedagogical skills. Some participants found that the data collected through questionnaires were suitable as a starting point for the interview questions. Data collected through observations might be used in a similar manner.

Within the semester programme, the participants completed two data feedback cycles instead of one, to enable them to apply lessons learned from the first cycle in the second. This worked for some participants: in the second cycle, for example, Margaret and Isabelle made clear improvements to their data collection methods and Tobias to his pedagogical skills.

Implications for Practice

Most participants collected data that were insufficiently suited to improving a targeted pedagogical skill. The reason for this might be that they did not have a clear personal goal in view, despite this being a requirement for (re)designing a suitable data collection method (Schnellert et al., 2008). The participants started each data feedback cycle by setting their goals and selecting a pedagogical skill they intended to improve (Table 1), but most of them were unable to (re)design a data collection method that yielded suitable data relating to both the specific characteristics of the chosen skill and the participant’s personal emphasis in terms of what he/she wanted to learn about that skill. Margaret was one of the participants with a clear personal goal. She showed an understanding of the specific characteristics of the pedagogical skill she had chosen, and these characteristics were visible in the data she collected. The goal she had from the outset enabled her to collect suitable data during the second data feedback cycle. To allow student teachers to tailor the pedagogical skills [Table 1] to specific personal goals, they need a clear view of the intended result: what should the skill look like in my practice once I have mastered it [observations], or what should my pupils say after a lesson if I have mastered the skill [questionnaire and interview]? By encouraging student teachers to reflect on such questions in step 1 [Figure 1], they will acquire a more concrete notion of what they want to achieve in their own practice, related to both the 5E skills and their personal context. If student teachers consider the chosen pedagogical skill as part of their professional development as prospective teachers, it is likely that, like Tobias and Margaret, they will be more prepared to take ownership of developing those skills.

Another explanation for why only a few participants succeeded in collecting suitable data to improve their pedagogical skills lies in their lack of research skills. Therefore, data feedback step 2 [(re)designing a data collection method] and step 3 [collecting data] need to be extended with detailed instructions and discussions on how to design and perform the relevant data collection methods, illustrated, for example, by good and bad practices. This should enable student teachers to (re)design reliable and valid data collection methods and equip them sufficiently to carry out these methods.

Limitations and Suggestions for Further Research

This exploratory study paid insufficient attention to how student teachers analysed and interpreted their collected data. However, once student teachers have collected valid data, they need to be enabled to analyse and interpret those data in order to make well-informed decisions on how to improve their pedagogical skills. This aspect deserves explicit attention in follow-up studies on data feedback. In general, this study underestimated the complexity of having student teachers collect data aimed at helping them improve their pedagogical skills. Nevertheless, the findings showed how student teachers collect and interpret data, which problems they encounter in doing so, and how some of them overcame these problems. While this was only a small-scale study limited to the context of primary teacher education, we expect that these insights will allow us to optimise (student) teacher education programmes in which data feedback has a central role. The key components worth considering for the optimisation of such a programme appear to be: a) encouraging (student) teachers to set specific and personal goals, b) educating (student) teachers in collecting, analysing and interpreting valid data, and c) teacher educators acting as role models who explicitly support (student) teachers in collecting, analysing and interpreting data for data feedback. Follow-up research will focus on these three factors in an attempt to improve (student) teachers’ education in teaching S&T.


This study aimed to design and test data collection methods for their suitability to generate data for data feedback, in order to aid the development of student primary teachers’ pedagogical skills for IBSE. Our findings suggest that such suitable data can be collected with three of the four data collection methods we tested, namely observations, interviews, and questionnaires. However, the findings also showed that most of the participating student teachers struggled with (re)designing and performing data collection methods and were thus unable to collect suitable data. To improve this, specific and profound curriculum adaptations are required.

Although few in number, our results illustrate some good practices. These good practices suggest that, provided certain conditions related to setting specific and personal goals and collecting valid data are met, student teachers are able to use data feedback to develop their S&T teaching skills. They indicate that data feedback has the potential to aid student teachers in developing the pedagogical skills needed to stimulate their pupils’ attitude towards S&T. In view of this, it is worthwhile to further investigate learning how to use data feedback in teacher education.


This research was funded by the Nederlandse Organisatie voor Wetenschappelijk Onderzoek [NWO; The Netherlands Organisation for Scientific Research].

Figure 1. Data feedback cycle for development of IBSE teaching skills
Figure 2. One pupil’s drawing showing emotion during the IBSE activity [Note: The statement above the drawing can be translated as “very much fun”]
  • Akkerman, S., Admiraal, W., Brekelmans, M. and Oost, H. (2008). Auditing Quality of Research in Social Sciences. Quality & Quantity, 42(2), 257-274.
  • Alake-Tuenter, E. (2014). Inquiry-based science teaching competence of pre-service primary teachers. Wageningen University, Wageningen.
  • Appleton, K. (2006). Science pedagogical content knowledge and elementary school teachers. In K. Appleton (Ed.), Elementary science teacher education (pp. 31-54). Mahwah, NJ: Lawrence Erlbaum.
  • Avery, L. M. (2012). Teaching science as science is practiced: Opportunities and limits for enhancing preservice elementary teachers’ self-efficacy for science and science teaching. School Science and Mathematics, 112(7), 395-409.
  • Baarda, B. (2010). Research: this is it! Guidelines for setting up, doing and evaluating quantitative and qualitative research (1st ed). Groningen: Noordhoff.
  • Butler, D. L. and Schnellert, L. (2012). Collaborative inquiry in teacher professional development. Teaching and Teacher Education, 28, 1206-1220.
  • Bybee, R., Taylor, J. A., Gardner, A., Scotter, P. V., Powell, J. C., Westbrook, A. and Landes, N. (2006). The BSCS 5E Instructional Model: origins and effectiveness. Colorado Springs, CO.
  • Carlson, D., Borman, G. D. and Robinson, M. (2011). A multistate district-level cluster randomized trial of the impact of data-driven reform on reading and mathematics achievement. Educational Evaluation and Policy Analysis, 33(3), 378-398.
  • Driver, R., Asoko, H. and Leach, J. M. S. P. (1994). The Nature of Scientific Knowledge. Educational Researcher, 23(7), 3.
  • Filderman, M. J. and Toste, J. R. (2018). Decisions, decisions, decisions using data to make instructional decisions for struggling readers data-based decision making. Teaching Exceptional Children, 50(3), 130-140.
  • Fitzgerald, A., Dawson, V. and Hackling, M. (2013). Examining the beliefs and practices of four effective Australian primary science teachers. Research in Science Education, 43(3), 981-1003.
  • Fullan, M. (2001). The meaning of educational change. New York and London: Teacher College Press.
  • Gerard, L. F., Spitulnik, M. and Linn, M. C. (2010). Teacher use of evidence to customize inquiry science instruction. Journal of Research in Science Teaching, 47(9), 1037-1063.
  • Gillies, R. M. and Nichols, K. (2014). How to support primary teachers’ implementation of inquiry: teachers’ reflections on teaching cooperative inquiry-based science. Research in Science Education, 45(2), 171-191.
  • Goldston, M. J., Dantzler, J., Day, J. and Webb, B. (2013). A psychometric approach to the development of a 5E Lesson plan scoring instrument for inquiry-based teaching. Journal of Science Teacher Education, 24(3), 527-551.
  • Hilby, A. C., Stripling, C. T. and Stephens, C. A. (2014). Exploring the disconnect between mathematics ability and mathematics efficacy among preservice agricultural education teachers. Journal of Agricultural Education, 55(5).
  • Hillman, S. J., Zeeman, S. I., Tilburg, C. E. and List, H. E. (2016). My Attitudes Toward Science (MATS): the development of a multidimensional instrument measuring students’ science attitudes. Learning Environments Research, 19(2), 203-219.
  • Hodgson, C. and Pyle, K. (2010). A literature review of Assessment for Learning in science. Slough: Nfer. Available at:
  • Ketelaar, E., Koopman, M., Den Brok, P. J., Beijaard, D. and Boshuizen, H. P. A. (2014). Teachers learning experiences in relation to their ownership, sense-making and agency. Teachers and Teaching: Theory and Practice, 20(3), 314-337.
  • Krueger, R. A. and Casey, M. A. (2015). Focus groups: A practical guide for applied research (5th ed.). Singapore: Sage publications.
  • Laevers, F. and Peeters, A. (1994). De Leuvense betrokkenheidsschaal voor leerlingen LBS-L: Handleiding bij de videomontage [The Leuven involvement scale for pupils LBS-L: Manual for the video montage]. Leuven: Centrum voor Ervaringsgericht Onderwijs.
  • Murphy, C., Murphy, C. and Kilfeather, P. (2011). Children Making Sense of Science. Research in Science Education, 41, 283-298.
  • Murphy, C., Varley, J. and Veale, Ó. (2012). I’d rather they did experiments with us…. than just Talking: Irish children’s views of primary school science. Research in Science Education, 42(3), 415-438.
  • Onstenk, J., Kallenberg, T., Koster, B. and Scheepsma, W. (2011). Ontwikkeling door onderzoek. Amersfoort: Thieme Meulenhoff.
  • Osborne, J., Simon, S. and Collins, S. (2003). Attitudes towards science: A review of the literature and its implications. International Journal of Science Education, 25(9), 1049-1079.
  • Schildkamp, K. and Kuiper, W. (2010). Data-informed curriculum reform: Which data, what purposes, and promoting and hindering factors. Teaching and Teacher Education.
  • Schnellert, L. M., Butler, D. L. and Higginson, S. K. (2008). Co-constructors of data, co-constructors of meaning: Teacher professional development in an age of accountability. Teaching and Teacher Education, 24(3), 725-750.
  • SLO. (2017). Observatie instrument. Available at: (Accessed March 3, 2018)
  • Smith, G. (2013). An innovative model of professional development to enhance the teaching and learning of primary science in Irish schools. Professional Development in Education, 40(3), 467-487.
  • Van Aalderen-Smeets, S. I., Walma Van Der Molen, J. H. and Asma, L. J. F. (2012). Primary teachers’ attitudes toward science: A new theoretical framework. Science Education, 96(1), 158-182.
  • Van den Hurk, H. T. G., Houtveen, A. A. M., Van de Grift, W. J. C. M. and Cras, D. W. P. (2014). Data-feedback in teacher training. Using observational data to improve student teachers’ reading instruction. Studies in Educational Evaluation, 42, 71-78.
  • Van Geel, M., Keuning, T., Visscher, A. J. and Fox, J.-P. (2016). Assessing the Effects of a Schoolwide Data-Based Decision Making Intervention on Student Achievement Growth in Primary Schools. American Educational Research Journal, 53(2), 360-394.
  • Van Veen, K., Zwart, R. and Meirink, J. (2012). What makes teacher professional development effective? A literature review. In M. Kooy and K. van Veen (Eds.), Teacher learning that matters: International perspectives (pp. 3-21). New York, NY: Routledge.
AMA 10th edition
In-text citation: (1), (2), (3), etc.
Reference: Bom PL, Koopman M, Beijaard D. Student Teachers’ Use of Data Feedback for Improving their Teaching Skills in Science and Technology in Primary Education. European Journal of STEM Education. 2019;4(1):09.
Related Subjects
Science Education, STEM Education
This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.