You may have noticed the inclusion of ‘Live Text’ in your photos in a recent iOS update (it arrived in iOS 15). This is a really cool addition that allows your camera to detect text within an image, enabling you to use that text just as you would text on a website or in a message on your phone or iPad, i.e. copy and paste it into different apps. To find it, just tap the little icon that appears in the bottom corner of any of your photos (see below).
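(For the technically curious: Apple exposes this same kind of on-device text recognition to developers through its Vision framework. Here is a minimal Swift sketch of pulling the text out of a photo. It is illustrative only; Apple has not published how Live Text itself is implemented.)

```swift
import UIKit
import Vision

// Recognise printed text in a photo, roughly what Live Text does on-device.
// Illustrative sketch only; the real Live Text pipeline is not public.
func recognizeText(in image: UIImage, completion: @escaping (String) -> Void) {
    guard let cgImage = image.cgImage else { return }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Take the single best candidate for each detected line of text.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines.joined(separator: "\n"))
    }
    request.recognitionLevel = .accurate   // favour accuracy over speed

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```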

A nice added benefit to this feature comes to light when you also turn on the iOS accessibility feature ‘Spoken Content’. This hidden gem adds a ‘Speak’ option to the menu that appears when you highlight text on your screen. So not only can you use the text you have captured, you can also listen to it read aloud.
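(Again for the curious: the speech side of this is available to developers through Apple’s AVFoundation framework. A minimal sketch of speaking a captured string aloud might look like the following; the en-GB voice is purely an assumption for illustration.)

```swift
import AVFoundation

// Speak a string aloud, similar in spirit to the 'Speak' menu option.
// Sketch only; the system feature itself is switched on in Settings, not code.
let synthesizer = AVSpeechSynthesizer()

func speak(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-GB") // assumed locale
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)
}
```

You could then pass whatever the text recognition returned straight into speak(_:).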

We’re not sure why it isn’t a standard menu item within iOS anyway, as there are so many instances in which it is incredibly handy. Alas, we must rummage in our settings to turn it on (Settings > Accessibility > Spoken Content > Speak Selection). Once enabled, it stays on and only appears when you need it.

Thinking specifically about a school setting, this could be the difference between completing a quick maths challenge at the start of the lesson independently (and at the same speed as your friends), and waiting for the teacher or TA to be available to read the questions to you before you can start the actual work. You may be able to read the words yourself but find it difficult to work out what they mean in context when it’s all in your head. The ability to listen to them read aloud could be the difference maker you need.

It also works with handwriting, so we can interact with what the teacher has written on the interactive whiteboard, either by taking a photograph of the board itself or via screen sharing (more on this in a later post).

If you would like any advice or support in using these features please feel free to get in touch!

#domorewithtechnology