flickr photo shared by mrkrndvs under a Creative Commons ( BY-SA ) license

I received a request today for any research that could be used to show how ICT is improving student outcomes. To me, this is such a complicated problem. Firstly, what is a ‘student outcome’? Secondly, technology is a tool used to support learning, not something that can necessarily be measured in and of itself. The question to be considered, then, is what outcomes do you measure in order to ascertain the impact of technology? Here is my list of possibilities:

  • Engagement: This is often the first place that people go to. Maybe this focuses on whether students identify with learning or participate in it. However, as David Price highlights, measuring engagement is not always straightforward. That is, it is not always visible, and it is not simply about test scores or having fun. Seymour Papert touches on this when he suggests that learning should involve ‘hard fun’, where learning is difficult, rather than easy.
  • School Connectedness: This is often a barometer used in surveys like Attitudes to School. However, although such measurement is useful when it comes to well-being, it is not so obvious when it comes to technology.
  • Collaboration and Problem Solving: This is popular when it comes to 21st century learning and has received considerable focus, particularly from the ATC21S group. The challenge is often capturing the different facets of cooperative learning and where technology sits in this.
  • Learning Agency: Like engagement and connectedness, what agency means can differ from person to person. Claire Amos has provided a detailed guide for introducing agency. First on her list, she argues for one-to-one access, although how you differentiate this from the rest of the list I am not sure.
  • Creativity: Sir Ken Robinson describes this as “putting your imagination to work”. However, like collaboration and problem solving, this can be hard to pin down, especially in relation to tools and technology.
  • Digital Citizenship: Often people argue we should use technology as a model for life. An example of this is provided by Alec Couros and Katia Hildebrandt in their digital citizenship curriculum for the Saskatchewan Schools District. Much of this is also included within the new Digital Technologies curriculum, with more focus on how technology works. Although tools like David White and Alison Le Cornu’s mapping of the web in terms of residents and visitors provide a useful point of reflection, they do not necessarily demonstrate specific learning growth.

In the end, though, the problem that exists beyond what to measure is the question of how the technology is actually used and why. A more fruitful approach is to develop a holistic action research project incorporating the ioi process. Instead, people commit themselves to frameworks like SAMR to guide them. In addition to this, the reality is that a school further along the road towards normalisation is going to have more success with technology than one at the beginning of its journey. Importantly, Mal Lee points out that,

Until school principals are of a mind to transform ‘their’ school, the staff and the school’s community have little likelihood of changing the status quo.

The problem though as Paul Tozer points out is that at present, with the focus on NAPLAN and VCE, moving into the digital realm is not always a priority.

For those interested, here is a list of research, presentations and publications shared online:

As always, comments, links and suggestions welcome.

If you enjoy what you read here, feel free to sign up for my monthly newsletter to catch up on all things learning, edtech and storytelling.

Know Thy Digital Impact – A Reflection on Digital Research by Aaron Davis is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

14 thoughts on “Know Thy Digital Impact – A Reflection on Digital Research”

  1. Hi Aaron,
    As always, a great read. I’ve been thinking about this issue a lot in the lead-up to our own BYOD program and since it began. We ask families to pay a significant amount for these devices while there is still considerable debate about whether or not they have any appreciable impact. I believe that we shouldn’t ask students to do meaningful tasks, good research, high-level analysis and accurate simulations without giving them the tools to do so. Imagine trying to analyse the PISA data, for example, without using a computer. However, I also know it’s not enough to just say “I believe”. We need more than that to justify the cost to families of 1:1.
    I’ve taken a long time to say nothing above.
    But, I’d add to your excellent list of articles Audrey Watters’ piece from last year, which I’m sure you’ve read:

    And this, which popped up on Facebook last night courtesy of the AEU.


    • Thanks as always for the comment Eric.
      I did catch the Future Tense episode with Neil Selwyn, and as always Audrey Watters provides some sense amid the hysteria.
      It is such a challenging topic, I kind of feel that it is the missing element within the Edustar diagnostic tool.

  2. Neil Selwyn provides seven brief bits of advice for any teacher wanting to make sense of technology. They include:

    Be clear what you want to achieve
    Set appropriate expectations
    Aim for small-scale change
    Pay attention to the ‘bigger picture’
    Think about unintended consequences
    Technology use is a collective concern
    Beware of over-confident ‘experts’

    This reminds me of my call for pedagogical coaching when it comes to technology. Also another post to add to my list of research associated with technology.

  3. Dan Meyer discusses his simple rubric for evaluating edtech: “What happens to wrong answers?” He explains that “every wrong answer is a resource and we shouldn’t waste it.” The challenge is to value the student’s answer.
    Meyer ends his post with a collection of other useful resources associated with reviewing technology, including a framework for thinking about technology centred on equity and Robert Talbert’s post reconsidering points-based scoring.
