I’m not a programmer so I have to find ways to get things done until I learn more. Tom Woodward ‘Text Acrobatics in Google Spreadsheets’

I have been spending quite a bit of time lately thinking about how to improve my processes. Here is another example.

I was recently asked to review the setup of a number of schools that I am a part of supporting. At the moment there are nearly three hundred schools. Once upon a time there were only fifty or so schools; back then, such a task was tedious, but not that time consuming. However, as the numbers increase, such reviews become a cumbersome affair.

The focus of my analysis was understanding the rollover rules for each semester. This started with the creation of a unique list of all the schools in the data set using the UNIQUE formula. Alongside this list, I also created an additional column in which I numbered each school. I then used a VLOOKUP to add this information to the data so that I could visually break up the data using conditional formatting with a custom formula:

= ISEVEN($B2)
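The VLOOKUP feeding column B was nothing elaborate; a sketch along these lines, where the SCHOOLS tab and the exact ranges are assumptions on my part for illustration:

=VLOOKUP(A2,SCHOOLS!$A$2:$B$300,2,FALSE)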

For the next step, I created a data validation to help categorise what sort of setup a school had. In addition to this, I created a data validation for the issues column produced from the column itself. To do this I created a separate UNIQUE column summarising all the issues. I then referenced the UNIQUE column and simply chose ‘show warning’ for invalid responses. This way, if there was an issue I had previously entered, I could simply select it from the dropdown list.
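The UNIQUE column driving that validation was little more than a sketch like the following, assuming the issues sit in column J of the same sheet:

=UNIQUE($J$3:$J)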

Once complete, I created a series of QUERY formulas matching the various issues and then used a JOIN to summarise all the schools associated with each issue in one cell:

=ARRAYFORMULA(JOIN(", ",TRANSPOSE(QUERY(A3:J202,"SELECT A WHERE J matches '.*Description of the issue.*'"))))

On reflection, I wondered how I could cut down on this manual process. My intent was to create a template in which I could load the raw data and then have it broken down for me using formulas.

After spending a lot of time trying to reverse engineer the output I wanted using various IF, FILTER and UNIQUE formulas, I realised that rather than trying to embed everything in one formula, I might be better served creating an additional table to help further organise the data.

The first step was to pull the data into this helper table using a QUERY. This meant that once the external data was loaded into Sheet1 (the default first sheet), I just needed to refresh the query for everything to work.
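The QUERY itself was a simple sketch along these lines, assuming the raw export sits in Sheet1 with the headers in row one:

=QUERY(Sheet1!A1:J,"SELECT * WHERE A IS NOT NULL",1)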

In addition to this, I created a unique reference for each line by joining together the different pieces of information associated with that line.

=TEXT(JOIN("",C2:F2),"")

In a new tab, I put together an analysis of the schools. I started this by generating a UNIQUE list of all the schools.

=UNIQUE(DATA!A2:A)

Using this data as a reference, I then created a query for each school, joining the different rollover rules together.

=(JOIN("|",QUERY(DATA!A$2:I,"SELECT I WHERE A = '"&A2&"'")))

I then used this data to produce a summary. This began with a UNIQUE list of combined rollover rules.

=UNIQUE(QUERY(SCHOOLS!A2:C,"SELECT B,C WHERE A IS NOT NULL"))

Using this as a reference, I queried all the schools with the same rule and joined this data together.

="E"&JOIN(",E",QUERY(SCHOOLS!A2:B,"SELECT A WHERE B ='"&C2&"'"))

I also created another column in which I used a SWITCH formula to describe each set of rules in a more meaningful manner, beginning with a category: “standard”, “two semesters” and “needs to be fixed”. Associated with this, I created a column which pulled this category and then used this information to create a series of conditional formatting rules in order to visually differentiate the different setups.
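The SWITCH column was roughly of this shape; the rule strings here are placeholders rather than the actual values, and anything that does not match falls through to the default:

=SWITCH(B2,"Rule A|Rule A","standard","Rule A|Rule B","two semesters","needs to be fixed")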

For anyone interested, a copy of the template can be found here.


After completing this exercise, I have found that I can easily adjust this template for any other data I wish to analyse quickly. It is also testament to the power of making your own tools. I am left wondering if I could use macros or scripts to improve the process, but for now my formulas, all duct-taped together, are working for me. As Tom Woodward sums up:

I’m not a programmer so I have to find ways to get things done until I learn more.


If you’re trying to understand complex formulas, peel the layers back until you reach the core (which is hopefully a function you understand!). Then, build it back up in steps to get back to the full formula. Ben Collins ‘Use The Onion Framework To Approach Complex Formulas’

I was recently asked to have a look at a spreadsheet that listed, as a reference, all the guides and videos located in a support folder. This folder also included a number of folders within folders, which created a level of complexity.

The current workflow involved using File Cabinet, an add-on from the developers behind Awesome Table, to generate a list. The problem with this list was that the exported data was not in the desired format. Whereas the desired output focused on file, folder and link, the list produced by File Cabinet included other information such as owner, creation date and file size. In addition to that, there were some folders and files listed that needed to be removed. It had therefore become a manual-heavy exercise to refresh the data and update the directory, meaning it did not happen that regularly.

After comparing the current finished product with the list produced by File Cabinet, I realised that rather than displaying the sub-folder, the directory was showing a top-level folder. To manage this, I used a SWITCH formula to create a new column mapping each folder name to the overall area it belonged to.

To create the SWITCH, I created a UNIQUE list of all the folders and then pasted them as values so that I could delete those not required to be displayed in the directory. After culling the list, I then added the area in the cell next to each folder. Once I had these two columns complete, I used this formula to add quotation marks to them all:

=ArrayFormula(""""&Sheet1!A:D&"""")

I then copied this list into the SWITCH formula and added in the commas:

=IFNA(SWITCH(B1:B,
" EOY 2020","EOY",
"Absences and Attendances","eSIS",
"Class Maintenance","eSIS",
"Crystal Reports","eSIS",
"Debtors","eSIS",
"eSIS - Information Sheets","eSIS",
"eSIS Navigation-Overviews","eSIS",
"General _ Misc","eSIS",
"Managing Community","eSIS",
"Managing Student Lifecycle","eSIS",
"Reporting","eSIS",
"Staff Maintenance","eSIS",
"Student Medical Maintenance","eSIS",
"SynWeb","eSIS",
"Timetabling","eSIS",
"eLearn General ","eLearn",
"eLearn QRGs","eLearn",
"Accounts Payable","eFIN",
"General","eFIN",
"Purchasing","eFIN",
"ePortal","ePortal",
"Create New Staff","eHR",
"eHR - General","eHR",
"eHR Materials","eHR",
"Employee Self Service","eHR",
"Employee Self Service (ESS)","eHR",
"End of Year 2020 - eHR & Payroll Guide","eHR",
"Leave","eHR",
"ICON General Information","General Information",
""," "
))

With this additional column, I then used a QUERY to capture the title, area and link to present as a directory in a separate spreadsheet to share with a wider audience:

=SORT(QUERY(IMPORTRANGE("docs.google.com/spreadsheets/d/.../edit","Sheet2!A:D"),"SELECT Col1,Col4,Col3 WHERE Col4 is NOT NULL"),2,FALSE)

Ideally, if creating this solution from scratch, I would probably have started each folder with a prefix indicating which area it belonged to, therefore avoiding the need for the SWITCH formula. However, that horse had long bolted. I also like the idea of using the SWITCH formula to manage which folders are displayed. It is definitely easier to pass on to somebody else.


I feel that it would be plausible to find a different script (see for example these examples from Alice Keeler, mesgarpour and Spreadsheet Dev) that might do some other things, such as run the process as a cron job or pull only the desired data. However, that is beyond my current skillset and patience level to dig any further at this point. Therefore, I will stick to using various formulas to filter out the data for me.



The challenge to me is to go beyond the question of instruction and understanding of different languages. Beyond debates about fitting it within an already crowded curriculum. Instead the focus should be on creating the conditions in which students are able to take action and create new possibilities. Maybe this involves Minecraft, Ozobot or Spheros, maybe it doesn’t. Most importantly it involves going beyond worrying about training or competency, as Ian Chunn would have it, and instead embracing the world of making by leading the learning. - Aaron Davis ‘Did Someone Say Coding?’

In my work, I recently had an issue with absence data where the system had generated some duplicate data. Short of creating an actual report within the system (which I have neither the time nor access to do), I decided to have a go at creating a reusable template that I could easily share with anyone. The idea was to take the data and produce a number of specific outputs, including the duplicates and a specific set of ‘All Day’ duplicates.

I started the template by creating a spreadsheet with three tabs: Sheet2, DUPLICATES and ALLDAYDUPLICATES. Sheet2 contained a summary of the ID, event date, name, absence type and event type.

=QUERY(Sheet1!A1:R,"SELECT A,R,E,F,G WHERE A IS NOT NULL")

I then made a column of unique data by combining the ID and event date. To do this, I used Ben Collins’ trick of putting formulas in a named range as a means of capturing the variable length of the data.

=ARRAYFORMULA(CONCAT(INDIRECT(IDS),INDIRECT(DATES)))

With IDS named range being:

="'Sheet2'!A2:A"&COUNT(Sheet2!A2:A)+1

and DATES named range being:

="'Sheet2'!B2:B"&COUNT(Sheet2!B2:B)+1

I then used the column of unique data to create a TRUE / FALSE column as to whether the absence might be a duplicate.

=ARRAYFORMULA(countif($F:$F,(INDIRECT(UNICON)))>1)

With the UNICON named range being:

="'Sheet2'!F2:F"&COUNTA(Sheet2!F2:F)+1

I borrowed this formula in part from Collins, who used something similar with conditional formatting to highlight duplicates:

=countif(A:A,A1)>1

In the DUPLICATES tab, I created a QUERY that brought in the duplicates ordered by date

=QUERY(Sheet2!A:G, "SELECT A,B,C,D,E WHERE G = TRUE and A IS NOT NULL ORDER BY B")

While in the ALLDAYDUPLICATES tab, I created another QUERY which focused on ‘All Day Absences’:

=QUERY(DUPLICATES!A:E, "SELECT A,B,C,D,E WHERE E = 'AllDayAbsence'")

With all this in place, to query any data, I make a copy of the template. I then import the absence data exported from the external system. Although I could possibly copy and paste the data, I have found that this is fraught with dangers. In regards to the ‘Import Location’ I select Insert New Sheet(s). This then loads the new data into Sheet1. I then click on the formula in Sheet2!A1 and press enter to refresh the QUERY formula and populate the rest of the spreadsheet.

For those interested, a copy of the spreadsheet can be found here.

Growing up, I never really went deep into the world of Microsoft Excel. It was not until exploring Google Sheets that I started appreciating the power and potential. One of the consequences of this is that a lot of the habits that I have learnt are not applicable in reverse. For example, Ben Collins has a site discussing the power of moving from Excel to Google Sheets. This includes a discussion of IMPORT features and the QUERY function. However, there is nothing that I am aware of that goes the other way. I guess this is the nature of coding in general. Every language has its own set of affordances and we work within these constraints. What the situation has taught me though is that sometimes solving problems is about working with what is at hand. I guess the challenge is developing our box of tricks so that we do have something at hand. In some ways, this reminds me of something Amy Burvall once said:

“in order to connect dots, one must first have the dots”

So what about you? What small tools have you created to solve a particular problem? What are the strategies that you use to continually add to your toolbox so that you are always ready when a problem may arise? As always, comments and webmentions welcome.



When I was doing a bit of coding every day, I found I could much more quickly grasp key concepts. But if I stopped for a few days or, every so often, a few weeks, when a crush of work in my day-job and a load of personal-life responsibilities arrived, it was like wiping the slate clean. I’d come back to work on a coding project and I’d have forgotten a shocking amount of basic stuff. Clive Thompson ‘Ten Lessons I Learned While Teaching Myself to Code’

The current pandemic has led to many changes in habits. One of these is that I like to be prepared when I go to the supermarket, especially when doing a big shop. Fine, I may not last out the two to three weeks that Zeynep Tufekci flagged early on:

For food, you can just buy two or three weeks’ worth of shelf-stable food that you would eat anyway, and be done; this could include canned food like beans and vegetables, pasta, rice, cereals or oats, oils/fats, nuts and dried fruits. It’s really not that hard because we’re talking two-three weeks, so whatever you get is fine. It doesn’t have to be expensive or super healthy or specialized ready-to-eat meals in camo boxes guaranteed to survive the meteor strike! Rice, beans, salsa, ramen, some sort of cooking oil, oatmeal, nuts and dried or canned fruits and vegetables enough for two weeks can be had at relatively little cost and take up fairly little space.

However, I at least try to limit how often I go out. That is usually the intent of the lockdown measures (Melbourne is in its fifth lockdown at the time of writing) used to hammer the virus. One strategy I have used is to be clear about what I might need at the shops, split into different sections. As a part of this, I wrote out a list of essential items and have been using this to create the weekly shopping list. Today, I decided to have a go at turning this into a spreadsheet using Google Sheets that I could use to generate the list. Here then are my steps:

List of Items

I started by writing something of a complete list of items. Alongside this, I categorised each item in a separate column. To save writing out each category every time, I created a separate list of unique categories and then used this with data validation to create a dynamic drop-down list. This meant that if I added a new category, it would then be added as an option. In a third column, I added a checkbox for each item, to be used to produce the weekly list.
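The dynamic list behind the drop-down was driven by nothing more than a UNIQUE formula in a helper range, which the data validation then referenced; something like the following, assuming the categories sit in column B of Sheet1:

=UNIQUE(Sheet1!B2:B)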

Switch the Category

Alongside the category, I used the SWITCH formula to create a sort order.

=IFNA(ARRAYFORMULA(SWITCH(B2:B,"Veg",1,"Fruit",2,"Meat",3,"Dairy",4,"Bakery",5,"Sweets",6,"Non",7,"Freezer",8,"Other",9)))

Generating the Summary List

Once the required items were ticked, I wanted a summary that I could copy into a message. My initial iteration was a simple query: =QUERY(Sheet1!A2:D,"SELECT C WHERE D = TRUE ORDER BY A"). The problem with this was that it did not put each item on a new line. To fix that, I used the JOIN function and the new line character.

=JOIN(CHAR(10),QUERY(Sheet1!A2:D,"SELECT C WHERE D = TRUE ORDER BY A")) 

Although this put each item on a new line, I then wondered about adding an emoji for each section to break up the information. To do this, I combined the new line and emoji characters, with a separate query for each category.

=""&CHAR(129477)&""&CHAR(10)&""& {JOIN(CHAR(10),QUERY(A2:D,"SELECT C WHERE A = 1 and D = TRUE"))} &""&CHAR(10)&""&CHAR(10)&""&CHAR(127822)&""&CHAR(10)&""& {JOIN(CHAR(10),QUERY(A2:D,"SELECT C WHERE A = 2 and D = TRUE"))} &""&CHAR(10)&""&CHAR(10)&""&CHAR(129385)&""&CHAR(10)&""& {JOIN(CHAR(10),QUERY(A2:D,"SELECT C WHERE A = 3 and D = TRUE"))} &""&CHAR(10)&""&CHAR(10)&""&CHAR(129472)&""&CHAR(10)&""& {JOIN(CHAR(10),QUERY(A2:D,"SELECT C WHERE A = 4 and D = TRUE"))} &""&CHAR(10)&""&CHAR(10)&""&CHAR(127838)&""&CHAR(10)&""& {JOIN(CHAR(10),QUERY(A2:D,"SELECT C WHERE A = 5 and D = TRUE"))} &""&CHAR(10)&""&CHAR(10)&""&CHAR(127851)&""&CHAR(10)&""& {JOIN(CHAR(10),QUERY(A2:D,"SELECT C WHERE A = 6 and D = TRUE"))} &""&CHAR(10)&""&CHAR(10)&""&CHAR(129387)&""&CHAR(10)&""& {JOIN(CHAR(10),QUERY(A2:D,"SELECT C WHERE A = 7 and D = TRUE"))} &""&CHAR(10)&""&CHAR(10)&""&CHAR(129482)&""&CHAR(10)&""& {JOIN(CHAR(10),QUERY(A2:D,"SELECT C WHERE A = 8 and D = TRUE"))} &""&CHAR(10)&""&CHAR(10)&""&CHAR(129531)&""&CHAR(10)&""& {JOIN(CHAR(10),QUERY(A2:D,"SELECT C WHERE A = 9 and D = TRUE"))}

Resetting Selections

The last task was to add a script to reset the checkboxes. For this, I added a script via Apps Script that I found here and created a button as a trigger. You can make a copy of the whole spreadsheet here.
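For reference, the reset script boiled down to a few lines of Apps Script along these lines; a minimal sketch, assuming the checkboxes sit in column D of Sheet1:

function resetCheckboxes() {
  // Untick every checkbox in column D (an assumed location) of Sheet1
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Sheet1');
  sheet.getRange('D2:D').setValue(false);
}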


When it comes to coding, Clive Thompson talks about learning to code by doing something every day and doing so with purpose. Here is another example of a solution that is as much about learning as it is about the solution itself. Comments and recommendations on improvements welcome.



The power of query in sorting out data in Sheets

One of the challenges we have within the project I am currently a part of is that we have never had a contextual reference point for school information. Although different teams have had various lists floating around, we needed one that we owned as a whole. Added to this, the lists were often kept in an Excel spreadsheet meaning nobody ever really knew if they had the definitive version. Being an organisation that uses Google, I suggested to a colleague that we work together to develop a spreadsheet in Google Sheets and share a link with colleagues. Here then are some of the steps we have taken in developing this:

Data in, Data Out

If there is one thing that I have learnt from Ben Collins’ work, it is the importance of cleaning up your data before you do anything else. This includes avoiding merged cells, an issue we had with older versions of the data. The focus was creating a dataset with each cell telling a particular story. Other than using CTRL+ENTER to separate the different parts of addresses and locking the header, there was no formatting applied to the core data.

We were also deliberate with how we collated the information. With 40+ columns, we spent some time splitting the data into four groups – details, contacts, administration and learning – with empty columns at the end of each in case additional fields needed to be added at a later date. Although I wanted to use Google Forms to structure all this, we decided not to because of the fluid nature of the dataset. There were also times when we wanted to be able to update the data in bulk for various reasons.

Ranges, Imported and Named

So that people were able to access the information, a second document was set up, with the data brought in via the IMPORTRANGE formula.

=IMPORTRANGE("https://docs.google.com/spreadsheets/d/abcdefghijklmnopqrstuvwxyz/edit", "CONTACTS!A1:CA500")

This meant that the wider team could have access to the content without messing up the original data. Although we could have restricted access to the tab or range in the source document, I was concerned that this would be fiddly to maintain, and therefore chose a separate document altogether.

Having imported the data into a sheet with IMPORTRANGE, I created a named range to reference it.

MAIN!A1:BE500

I tried to embed the IMPORTRANGE within my formulas, but for some reason it would not work, even after I gave permission. Therefore, I resorted to simply bringing in a copy and working with that. It also means there is only one external call running.

Different Data for Different Purposes

Once I had my data in place, I used a QUERY to reorganise it. In addition to having a single point of reference, my argument to my colleague was that we can represent this data in different ways depending on the purpose. As David Krevitt explains:

QUERY combines all of the capabilities of arithmetic functions (SUM, COUNT, AVERAGE) with the filtering abilities of a function like FILTER.

Using the QUERY formula, I created a copy of the core data, removing any blank columns:

=QUERY(MAIN,"SELECT B,C,E,D,F,G,J,K,L,M,N,O,P,Q,R,S,T,U,V,Y,Z,AA,AB,AC,AD,AE,AI,AJ,AK,AL,AM,AN,AO,AP,AQ,AR,AS,AT,AZ WHERE C IS NOT NULL ORDER BY B")

As I set the data up with space to grow, I used

WHERE C IS NOT NULL

This removed any blank rows. I also used

ORDER BY B

as this column contained the school numbers (e.g. E1234). In order to get this to work, we removed the E from the source document and used custom formatting to add the E back to the number, while also being able to sort numerically.
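The custom number format was something along the lines of the following (the number of digits is an assumption on my part), which displays 1234 as E1234 while leaving the underlying value numeric and sortable:

"E"0000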

In addition to the main list, I created a number of specific queries, focusing on things such as pay periods, business managers and applications being used.

=QUERY(ICON, "SELECT B,C WHERE X = 'Yes' ", -1)

I tried to make a query where the user would tick a checkbox associated with the information required, but could not figure out how to create a variable associated with the SELECT clause.

Custom Lookups

Another way of engaging with the data was to focus on a particular school. To do this I created a series of VLOOKUP formulas revolving around the E number which was strategically placed in the first column of the dataset.

=VLOOKUP(B$2,vLookup,5,FALSE)

In the formula above, B2 is the cell where the school number is entered and 5 is the column containing the information to be displayed. I then repeated this formula for all the other information to be displayed.

Another use was to create a lookup for the business managers associated with the different schools.

=VLOOKUP(A2,BM!Z1:AH500,2,FALSE)

This involved making a new named range with the name of the business manager in the first column. I also used data validation restricted to the names in the dataset.

I also then combined this search key with the QUERY formula to display the list of schools and their core information:

=QUERY(MAIN,"Select B,C,E,J,AZ WHERE Z = '"&$A$2&"' ORDER BY B")

Mix and Match Formatting

In order to make the data a bit more friendly, I added some conditional formatting. The challenge was that I wanted different colours for different groups of schools. I did this using the school number as the reference:

=LEFT($A2,2)="E1"

I then used an ISEVEN formula to break up the lists:

=ISEVEN(ROW())

Placing the ISEVEN conditional formatting rule at the top of the list of rules means that it is prioritised.


I have included this all here as a reference. I am not sure this is much use to anyone. As always, thoughts and questions welcome.



The power of query in sorting out data in Sheets

There are many challenges to sharing specific data in Google Sheets; some of these can be overcome using the QUERY formula.


I recently attended a day continuing the look into ongoing reporting. One of the focuses involved reflecting on various points of data. Something that stood out was the willingness of teachers to share data. Whether it be with students, teachers or parents, providing access can still be a challenge for some.

I discussed this with a principal attending the day, and one suggestion made was that it can be hard to share particular data without sharing everyone's data. This is especially the case when talking with parents or conferencing with a student. One answer is using the QUERY formula.

With Google the chosen platform, a lot of teachers store their data in Google Sheets.

QUERY combines a whole lot of functionality into the one formula. As David Krevitt explains:

QUERY combines all of the capabilities of arithmetic functions (SUM, COUNT, AVERAGE) with the filtering abilities of a function like FILTER.

It is a language developed by Google, using the principles of Structured Query Language (SQL).

For example, you may wish to share the results of just one student:

To do this, you select the columns you want to copy from the MASTER tab and which value you wish to filter by:

=QUERY(MASTER!A:F, "SELECT C,D,E,F WHERE A = 'Donna'", -1)

If you wished to quickly create a tab for each student, Alice Keeler has created a script for generating tabs from a list. This personalised information can then be shared with students (see Jake Miller’s explanation). Another thing to consider is protecting formulas by adjusting permissions at a cell level, as well as preventing others from copying the file and therefore getting the information that way.

The other option is to create a dynamic selector, involving either a name or even a whole form.
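As a sketch, the dynamic version simply concatenates the selected name into the query string; here the assumption is that the name is chosen in cell B1:

=QUERY(MASTER!A:F, "SELECT C,D,E,F WHERE A = '"&B1&"'", -1)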

These options might be used when working with colleagues or talking with parents. You are able to bring up just the information required. Depending on the data, you can also create dynamic charts. One other benefit of using the QUERY formula is that it allows you to quickly and easily reorder the representation of data. So let's say ‘F’ is associated with literacy testing and you would like that at the start. Rather than writing SELECT C,D,E,F you would write SELECT F,C,D,E.

This is only the tip of the iceberg of what the QUERY formula can do. For more information, see posts from Ben Collins and David Krevitt. Collins also ran two webinars, which you can go back and watch: one covers the basics, while the other gets a bit more complicated.


One thing to note when using Google Sheets to store data is what sort of information you are collecting. In some districts and regions there are issues raised about storing ‘sensitive data’ in platforms like Google.

Reflecting on Class Dojo, Ben Williamson explains that ‘sensitive’ can be the consequence of collecting data:

The ‘sensitive information’ contained in ClassDojo is the behavioural record built up from teachers tapping reward points into the app.

This same concern needs to be considered in regards to Sheets, especially with the changes being brought about by GDPR.


As always, comments welcome. Webmentions too.



The power of query in sorting out data in Sheets

My first iteration using Query and Sheets to automate a solution for turning a collection of data into a regular newsletter.

This year I decided to create a monthly newsletter collecting together the updates and resources that I came upon during the previous month. As I have reflected elsewhere, I realised that there was an opportunity to automate some of this process. Although some spoke about Pinboard or generating a post via email, I was interested in sticking to GSuite. I therefore began with the aim of generating a solution within Sheets. So here is my first iteration of an automated solution for turning a collection of links into a summary.

Organising the Data

I remember being in a session with Jay Atwood a few years ago talking about the importance of considering the way you collect your data before anything else. As I looked at my databases, one for updates and the other for resources, I realised that the first thing I needed to do was reorganise the way that I was storing information. This included restricting the options associated with type and application, as well as separating the link and title and then smashing them together using the HYPERLINK formula. Inspired by Ben Collins’ post on working with text, I also created a column summarising the information in each row as a Markdown summary.
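The two formulas doing the smashing together were nothing fancy; sketches along these lines, assuming the title sits in column B and the link in column C:

=HYPERLINK(C2,B2)

="["&B2&"]("&C2&")"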


Reorganised Data w/ Formulas by mrkrndvs is licensed under CC BY-SA

From Lookups to Queries

Once I had my data organised, I then started exploring ways of sorting and shaping the data. I began with a VLOOKUP with a dynamic selector. This allowed me to filter it in different ways. However, I quickly realised that this was limited, so I turned to QUERY.

I remember David Krevitt talking about QUERY, describing it as the “big kahuna of Sheets functions”. I think this initially put me off. This time I opened up a number of guides from Krevitt, Collins and Anand Varma and dived in. This led me to rewrite my VLOOKUP as a QUERY.

Bit by bit I stretched the solution. I began with a dynamic selector to represent variables and explored the ability to define queries by date. I then created a prototype with a query for each application across the sheet.
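A sketch of one of those queries, with the application chosen in B1 and a cut-off date in B2 (the LINKS named range and both cells are assumptions for illustration):

=QUERY(LINKS,"SELECT A,B,C WHERE D = '"&B1&"' AND E >= date '"&TEXT(B2,"yyyy-mm-dd")&"'")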


1st Iteration by mrkrndvs is licensed under CC BY-SA

Once I had that working, I created a vertical set of queries. To allow for the unknown number of posts each month, I left 30 blank rows between each formula.

Filtering Results

To get rid of the spaces and the column headings produced by the QUERY formulas, I used a FILTER formula. This left me with a choice: copy the Markdown data and paste it without formatting, therefore removing the table/sheet that it was in, or use the add-on Sheets to Docs to copy the text to a Google Doc.
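The FILTER was a simple sketch along these lines, assuming the stacked query output sits in column A and the repeated heading is ‘Markdown’:

=FILTER(A2:A,A2:A<>"",A2:A<>"Markdown")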


So that is the first step in my solution using Sheets to generate the text for a newsletter. My next challenge is transferring this to a Google Script. If you have any thoughts and advice about this, I would greatly appreciate it. Otherwise, as always feel free to leave a comment.



Tom Woodward on APIs and automation

Building on the APIs provided by Google Sheets and Google Calendar, I demonstrate how to automate the addition and change of multiple events.


I have been spending some time developing resources to support schools with timetables and reports. One of the things that occurred to me is the challenge of remembering to complete a number of key steps throughout the year. I therefore thought of creating a digital calendar that could be shared with schools to help remind them.

In my search for a tool that would help with this process, I discovered From Sheets to Calendar, a Google Sheets add-on that would allow me to create a series of events from a spreadsheet. As I explored this, it occurred to me that this might have ramifications for other groups in my organisation, especially those coordinating professional development. I have seen many plan things out visually in a spreadsheet. However, this then means creating these events again in a calendar. Here then is a guide to managing events with GSuite and sharing them with others.


Setup Calendars

Before setting up the various events, you need to make sure that you have created the various calendars. For example, you might have one for ‘meetings’, ‘professional development’ and ‘events’.

First Start

With everything set up in Google Calendar, install the add-on in Google Sheets. Once this is done, go to the Add-ons menu and run ‘FirstStart’ to populate the template to work with.

Add in Events

With all the headings provided, enter the relevant information, such as title, time, location and description. Also, make sure ‘Add’ is listed against each of the events in the Action/Status column.

Import to Calendar

Once the events have been added, go to the Add-ons menu and run ‘Import to Calendar’ to create the events. Once created, there is the option to update and delete events by changing the request in the Action/Status column.

Share the Calendars with Others

Although it is possible to send invites via the sheet, the other option is to share the particular calendars. For schools using Microsoft Outlook, Google Calendars can be shared as an ‘internet calendar’. The other option for users without a Google Account is to download and share a copy of the iCal file. The problem with this is that recipients will not receive any subsequent updates.


One of the challenges with a solution like this is that it is dependent on someone else’s scripts and support. In addition to this, to create more than 20 events, you need to pay a subscription of US$1 a year. For those wanting an enterprise option, Zapier provides the connections, but it comes at a cost of $25 per month. There are also a range of scripts to build upon on GitHub. Another option is to manually import a CSV spreadsheet. So what about you? Are there any processes that you use when automating the creation of calendar events? As always, comments welcome.

Postscript

The add-on, From Sheets to Calendar, has been removed from the Google Workspace Marketplace; however, there are a number of other options. For me it highlights the benefits of solutions like Zapier, which manage the connections and any changes seamlessly in the background.



Spreadsheets

A plan for an automated monthly newsletter produced from Google Sheets. The intention is to structure the data so that it can be used in a number of ways.


I recently wrote a post reflecting on the Digital Technologies curriculum. One of the things that I realised through the process is that I often wait until the end to discuss my projects. Although this can be useful in providing an overview of learning and achievements, it does not necessarily allow others a means to provide feedback early on. I usually ask questions online, but this often lacks context. So this post is an attempt to plan out a new project, with the hope that others might be able to provide advice and guidance.

This year I started a monthly newsletter associated with Google. With GSuite the chosen learning and teaching platform in my organisation, I thought it would be useful to summarise the various resources for others. I started with a Google Doc, organising the various links under headings associated with the featured application, as well as a section documenting the overall updates.

This has ebbed and evolved as the year has gone on, with a clear order of applications to correspond with a range of modules. However, the question that has arisen is whether there is a better way of recording the various links and updates so that they are easily searchable.

Currently, you can go back through the various posts and look for resources, but this is both cumbersome and tedious. It therefore had me think about storing the links in a Google Sheet and possibly generating the monthly summary/newsletter from that.

I know that I could probably do this with a social bookmarking platform or even a blog, but I feel that putting the information into a spreadsheet provides more interoperability. It would mean that the data would be in a format that I could present in a number of ways. It also means the links could be recorded using something as simple as Google Forms.

I am therefore thinking of creating a script in Sheets that collates all the links for the month in a Google Doc. To be honest, Google Apps Script is all still new to me, but I am wondering about the possibility of creating a template with merge fields. I remember Autocrat doing something similar. I could then use this to post in WordPress.

I am left with a number of questions, such as how should I action the script? Would it need some sort of selector or could it be done automatically? How customisable are templates? Could I generate a markdown version for the purpose of posting?

Maybe you have an idea or a post that you would recommend checking out before beginning or just a tip of where to start. As always, comments welcome.



I recently came across a post from Jennifer Kloczko discussing the notion of original ideas. In it she documented some of the ideas that she has implemented this year and where she ‘stole’ them from. I too have written about this before, discussing the benefits of sharing and working collaboratively. However, one aspect that Kloczko does not necessarily address is the way in which ideas ‘stolen’ can morph and evolve as they become ingrained in other contexts. This process often starts with a problem. For me, this problem is Synergetic.

This year I have spent a lot of time learning about different facets of Synergetic. A ‘total solution’, Synergetic is a management system with a focus on administration. My work has included developing a reporting solution, setting up the attendance process and configuring online spaces. One particular area that has absorbed quite a bit of my time though has been timetabling.

For secondary schools this is more obvious. You link in a third-party application, such as Timetabler or Edval, to manage things. This however is not the case for primary schools. They have no need for the intricacies of a robust timetabling package; the issue though is that they still need a timetable.

The solution Synergetic offer is a lightweight application called Primary Time. It allows users to create timetable blocks and place them within a visual grid. I have two concerns with Primary Time. The first is that you need to populate a lot of information associated with rooms, teachers, groups and subjects that does not flow through to the timetable file produced. Although there is the means of uploading this information in Primary Time via CSV files (click here for a copy of the different files), these files still need to be made beforehand and maintained moving forward. The second issue is that there is not a direct connection between the two applications. Unlike other software packages that develop a constant connection that allows for a flow of information back and forth, users are required to manually download a file from Primary Time and upload this into Synergetic.

The problem with all of this is that the timetable applications ideally act as the source of truth when it comes to timetable related information. That means if Primary Time were to be used in this way, users would need to follow the tedious cycle of updating Primary Time and then reloading it, every time a change needed to be made, no matter how large or small this change may be. (This issue is compounded by the fact that you cannot download timetable related information from Synergetic and upload it into Primary Time.)

Another factor at play is the reality that most primary schools do not require an explicit period-by-period timetable. Instead, timetables are usually developed around exceptions, with the rest of the time being allocated to a classroom teacher, allowing them to balance and bend their teaching time. Locking in a highly descriptive timetable therefore serves little purpose and is often a hindrance, rather than a help. For most primary schools in Victoria the timetable is required for roll marking purposes, with primary schools mandated to enter results for AM and PM.

An alternative to using Primary Time is manually entering the timetable within Synergetic. Although this is an option, especially when generating roll marking periods, it too is still a very tedious process and not an ideal solution.

After these initial experiences, I was left with the question:

How might we use another application to make the development of a timetable for schools easier and more efficient?


At the end of the day, all that Primary Time does is produce a CSV file with six columns: day, period, form, class, room and teacher. This seemed quite simple. I therefore started by trying to reproduce a school timetable by cutting and pasting cells in Google Sheets. This worked, however it was still fiddly, therefore not a feasible solution.

I wondered if there was some way of automating this process, at least generating a basic timetable that could be manipulated. I remembered reading a post a while back from Martin Hawksey documenting a formula for repeating data. (A win for serendipity.) I started with this and then progressively unpacked each column further, addressing the particular requirements. I must admit, I did get some additional help from Hawksey on the formulas that I had conjured together. I eventually managed to put something together.

It involved entering the days in the timetable, the sessions running each day and a list of classes with their rooms and teachers. It would then generate a list with a copy of each class for each session. There were two problems with this. Firstly, it was not easy to add in the specialist classes. Secondly, it was still unwieldy and confusing.

Fine, I could make a copy of the timetable and adjust it in order to add specialists, but one of the challenges with this is working with the timetable in a list format. The ability to visualise the timetable is one of the benefits of using an application like Primary Time. My solution was to recreate the lists as a dynamic table, with a dropdown button to choose the teacher.

To make this, I built on the work of Ben Collins and David at CIFL around the use of the VLOOKUP formula. It also meant I had to INDEX the table to start with the form column. In addition to this, I made a dynamic selector created using data validation to choose the class. Although this answered the visual problem, as soon as different variables (a 10-day timetable or an 8-period day) were added, the table would break. Maybe there is another way I could do this, but I felt like I had hit a wall, so I decided to focus on making a simpler solution.

I had started out with the intent of making it easier to create a timetable for Synergetic, so rather than worry about creating a full timetable, I instead turned my attention to creating a timetable solely for roll marking purposes. Rather than create a line for each period, I focused on creating a line for each roll marking period. This way I did not have to worry about anyone making sense of the lists of periods and classes; instead the end user would enter their values and download the corresponding CSV file. The problem that remained was how to make this foolproof.

Spreadsheets can become busy places very quickly and the sight of lengthy formulas puts a lot of people off.

=TRANSPOSE(SPLIT(REPT(JOIN(",",ARRAYFORMULA(REPT(SPLIT($H$4,",")&",",$I$3))),$I$2),","))

The challenge then was to focus on inputting the values. I moved the inputs from the page with the formulas and made a separate sheet for that information. After some feedback I then split this information again, with one sheet providing a space to list the teachers, classes and rooms, while the other sheet provided a summary of the days in the timetable and the roll marking periods. Although this hopefully made it easier, there was still the challenge of downloading the CSV file.

To me the process was clear: enter the classes and definitions, then download the timetable file. The process of clicking on the right sheet, going to File and then downloading a CSV, though, raised too many concerns. I asked around to see if it would be possible to turn it into a script. Some colleagues said yes, but suggested just focusing on the downloading of the CSV, with that being the particular point of contention. So I did what I often do, and returned to Google.

I have wanted to explore Google Apps Script for a while, but always found other things to distract me. Finally I had a clear purpose. To start off I worked through Ben Collins’ introduction to Apps Script. It provided a useful starting point and a button for my script, but it did not address my particular challenge. After reading numerous forum posts and scrolling through the Developer Page and Drive APIs, I stumbled upon some code that Michael DeRazon had shared on GitHub for downloading a spreadsheet to Google Drive. As with all things borrowed, I took to bending the code to fit my needs, but none of the changes that I tried worked. I eventually had a colleague look over it and provide some guidance. He pointed out that there was a loop that was messing things up and came up with the solution of downloading the CSV file to Drive and then downloading it to the computer.

Although this was not necessarily the solution I had hoped for, in that it downloaded a copy to Drive before downloading a CSV to the computer, it at least addressed the problem, making the creation of a timetable easier and more efficient. You can get a copy of the sheet here.
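For what it is worth, the heart of the final script boiled down to something like the following sketch (the sheet and file names are assumptions on my part, and it sidesteps the question of escaping commas within values):

function saveTimetableAsCsv() {
  // Grab the timetable sheet and convert its rows into CSV text
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var rows = ss.getSheetByName('TIMETABLE').getDataRange().getValues();
  var csv = rows.map(function(row) { return row.join(','); }).join('\n');
  // Save a copy of the CSV to Drive, ready to be downloaded to the computer
  DriveApp.createFile('timetable.csv', csv, MimeType.CSV);
}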


For those who got this far, well done and thank you. It would have been easy to have just shared a copy of my resource and be done with it, but I think that the real value is found in the thinking behind it all. To me, this captures the power and potential of digital technologies as summarised by Richard Olsen. He breaks it down into the following:

  • Feedback-Rich Learning
  • Reuse-Rich Learning
  • Continuously Evolving Learning

To build on Jennifer’s point at the start, there are no original ideas and the remix thereof is an ongoing process.

So what about you? What ideas have you borrowed and bent? How have you changed and extended them? As always, comments welcome.


If you enjoy what you read here, feel free to sign up for my monthly newsletter to catch up on all things learning, edtech and storytelling.