Seng Kwang Tan

Home-Based Learning Using Google Meet

I conducted an online lecture this morning using Google Meet for students who had to stay home on a Leave of Absence mandated by the Ministry, as they had recently returned from another country during this period of the Covid-19 pandemic. I feel the need to document this because things might become bad enough that schools have to close, so this post can serve as a place where fellow teachers pick up some tips on how to manage such sessions.

The G Suite account I used is my school's, not MOE's, because it allows me to record the session in case I need to show it to students who did not "turn up" for the Meet. As the G Suite admin for the school, I changed the setting to allow Google Meets to be recorded. After the session, the recording appears automatically in a G Drive folder once it has been processed in the backend. ICON's Google Meet (part of MOE's Google Suite service) does not allow recording.

My hardware setup is simple: just my laptop to capture my face and control the Google Meet UI and a second screen with which to show my slides. I also entered the Meet as another participant using my mobile phone as I wanted to see what my students would see for added assurance.

Google Meet is very user-friendly, with a minimalist and intuitive design that one can expect from Google (after all, that was what made it the preferred search engine in the early days of the internet). All we needed to do was to sign in to https://meet.google.com/ and start a session. You can also schedule a session on Google Calendar.

When a Meet is created, a URL is generated, which you can communicate to your students via text message or email, or through a system announcement.

When students log in, be sure to ask them to switch off their video and mute their microphones so as not to cause any interference.

Note that what is shown on the presenter's screen in Meet, when using a laptop's front camera, is laterally inverted, as presenters generally want to see themselves as though looking in a mirror. So if you write on a whiteboard or a piece of paper, you will not be able to read the writing on your own screen. Rest assured, however, that students can still read the writing when they view you through the feed from your laptop's front camera.

Instead, what I did was to toggle between showing my face on the camera and projecting a window or a screen.

For today's Meet, I projected the window containing my PowerPoint slides but did not enter slideshow mode (which would take up both my screens), as I wanted to see the Google Meet UI at all times to know if anyone asked questions or raised an issue using the Chat function. This backchannel was very useful, as students could immediately tell me whether they could see or hear me. I had wanted them to ask questions through it too, but nobody did, unfortunately.

A few times, I toggled to use the camera. Once, it was to show a simple physics demonstration which I felt added some badly needed variety.

For future sessions, I intend to project a single Chrome window so that I can present the slides using Google Slides in an extended mode. This will also let me switch to an online video with ease, instead of selecting the window via the Google Meet UI, which can throw up too many options if one has many windows open (which I tend to do). I also intend to use Nearpod to gather responses from the students.

Collision Simulation

I created this post to bookmark some useful tools for my upcoming JC1 lectures on Dynamics.

This is a simulation for collisions that shows the momenta before and after each collision. It requires registration after one visit.

A better choice for now could be the EJSS version (created by my ex-colleague Lawrence) which is far more detailed.

I had wanted to build one using GeoGebra and in fact, was halfway through it, but the Covid-19 pandemic has created other areas of work that now take priority.

Prepopulated Free-Response Question in SLS

Now that the newest release of the Student Learning Space (SLS) is live, I was keen to try out the new features. Having heard that there is an option for prepopulated free response questions where students can draw on prefilled images, I was eager to test it.

I have made a video of the creation, student attempt and teacher feedback stages for any teacher (only for Singapore schools, though) who is keen to learn.

Two Body Problems in Dynamics

Problems involving two bodies moving together usually involve asking for the magnitude of the force between the two.

For example:

A 1.0 kg and a 2.0 kg box are touching each other. A 12 N horizontal force is applied to the 2.0 kg box in order to accelerate both boxes across the floor. Ignoring friction, determine:

(a) the acceleration of the boxes, and

(b) the force acting between the boxes.

Solving (b) requires the understanding that the free-body diagram of the 1.0 kg box can be considered independently: the contact force between the two boxes is the only horizontal force acting on the 1.0 kg box, so it alone accounts for that box's acceleration.
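The two steps above can be sketched numerically. This is a minimal worked example using the figures from the problem; the variable names are my own, not from any particular textbook.

```python
m1 = 1.0   # kg, the front box
m2 = 2.0   # kg, the box the force is applied to
F = 12.0   # N, applied horizontal force (friction ignored)

# (a) Treat both boxes as one system: a = F / (m1 + m2)
a = F / (m1 + m2)

# (b) Free-body diagram of the 1.0 kg box alone: the contact
# force is its only horizontal force, so F_contact = m1 * a
F_contact = m1 * a

print(a)          # 4.0 (m/s^2)
print(F_contact)  # 4.0 (N)
```

Note that the contact force (4.0 N) is smaller than the applied force (12 N), since the applied force must accelerate both masses while the contact force accelerates only the 1.0 kg box.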

This interactive app allows students to visualise the forces acting on each box separately, as well as on the two boxes as a single system.

The code for embedding into SLS:

<iframe scrolling="no" title="Two Mass Problem" src="https://www.geogebra.org/material/iframe/id/fh5pwc37/width/638/height/478/border/888888/sfsb/true/smb/false/stb/false/stbh/false/ai/false/asb/false/sri/false/rc/false/ld/false/sdz/false/ctl/false" width="638px" height="478px" style="border:0px;"> </iframe>

Measuring Difference in Drop Time Using PhyPhox

In a recent class on Kinematics, I prepared a string of 4 pendulum balls, separated by about 20 cm, and dropped them from a height. Before that, I got students to predict whether the time intervals between impacts would be constant, increasing or decreasing.

Most students were able to predict correctly that the intervals would decrease, and to explain their reasoning.

What challenged me was this: previously, we had to listen to the intervals of sound to verify the answer. I had tried using laptop software such as Audacity to record the sound before. However, I wanted students to be involved in this verification process. PhyPhox enabled that.

With every student able to download the mobile app onto their phone, all I needed to do was ensure everyone used the correct setting: Audio Scope, with the range changed to the maximum duration (500 ms). They then had to familiarise themselves with the play and pause buttons so they could stop the measurement in time to see the waveform.

I then did a countdown before dropping the balls. This is an example of the graph obtained.

Through this graph, you can see that:

  1. the time interval between impacts decreases, as the balls falling from a greater height have gained more speed by the time they reach the table.
  2. the amplitude of the sound increases, as the balls hit the table at higher speeds and therefore with larger forces.
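The decreasing intervals can be predicted from free fall alone. A quick sketch, assuming the lowest ball starts 20 cm above the table (an assumption about the setup) and using t = √(2h/g) for each ball:

```python
import math

g = 9.81        # m/s^2
spacing = 0.20  # m between balls; lowest ball assumed 20 cm above the table

# Fall heights of the four balls: 0.2, 0.4, 0.6, 0.8 m
heights = [spacing * n for n in range(1, 5)]

# Free fall from rest: t = sqrt(2h / g)
times = [math.sqrt(2 * h / g) for h in heights]

# Time between successive impacts
intervals = [t2 - t1 for t1, t2 in zip(times, times[1:])]

for dt in intervals:
    print(f"{dt * 1000:.0f} ms")  # roughly 84, 64, 54 ms: decreasing
```

Since t grows only with the square root of h, equally spaced balls reach the table at ever-shorter intervals, which is exactly what the waveform shows.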