Yesterday I had my first look at the new Tobii Dynavox TD Pilot eye gaze iPad device. The team at access: technology spent a couple of hours playing with the TD Pilot whilst our local TD rep, Joe, did a grand job of fielding our many questions.

Firstly, I must apologise for taking so few pictures/screenshots – I hadn’t intended to write this up.

It’s here

Some TL;DR info:

  • Price £6,990 + VAT
  • The device comes as an all-in-one, standalone unit with the iPad Pro 256GB – the camera, case etc. are not sold separately (it turns out, for good reason)
  • There are (currently) no loan options in the UK, trials are single sessions with a rep visiting – although I believe the TD UK Sales team are on this and that some of the AAC hubs may soon/already have access to an assessment device
  • There is a single charge cable for the iPad + Pilot
  • The case is modular to support future iPad upgrades, regardless of size changes
  • The rear screen only works with TD apps
  • The case has 2 switch ports
  • The iPad is ‘managed’ to enable the Pilot to be set up in advance, but completely unlocked for the user’s own account
  • The camera controls the iPad natively through Apple’s Assistive Touch accessibility feature, although the TD CoPilot app is necessary to calibrate the camera.
It just works… better – Tom McCallum

First impressions: it’s small – neater and more compact than I imagined. Joe unceremoniously plonked it down on the desk in front of him, and despite his seated positioning not being eye gaze ‘perfect’, and with sunlight streaming across the screen from a window, it… just works (I don’t think he even had to calibrate it) – very encouraging.

Device

The device has a Rehadapt mounting plate on the back and sits nicely on a desk, with an ingenious small (but strong) kickstand to adjust the angle of the Pilot if required. The iPad comes ready set up, connected to the Pilot as an eye gaze device along with the speakers and rear screen, which I suspect is why it has to be sold as an all-in-one ‘managed’ device. I don’t know how this might affect other MDM uses or whether there are any privacy concerns here – but I can appreciate it as a sensible consideration; the user error and support requirements involved in having users piece this together themselves with their own iPad would be complicated to manage. On that note, I forgot to ask if any kind of remote support access to the device is possible – I suspect not, but I hope I’m wrong.

Calibration

Calibration is performed through the TD CoPilot app. Using some sort of wizardry, a tiny rear button can launch the app at any time to check the eye gaze ‘track status’ (i.e. making sure the camera can see your eyes). As far as I could tell, only one calibration profile is possible, as the calibration is presumably stored on the camera itself (like the Hiru) – no software is required to work the Pilot. One device for one person has typically been the proposed implementation for other iPad eye gaze experiences, and this feels much the same here. I understand that camera firmware upgrades are performed through the CoPilot app.

Calibration is typical TD, using 2, 5 or 9 points, with exploding little dots much like the 4C experience on Windows 10. Currently, exploding tiny dots are the only option. It then launches a second window with 9 circles to test your profile. Initially, my calibration was terrible, and I had an impending feeling of disappointment (I’ve had this with other new eye gaze products that weren’t quite ready for market); however, it turns out I had just forgotten to hit save! I was using a colleague’s previous profile, so watch out for that.

Once I figured out this user error, it was surprisingly good. I had immediate, very accurate control of the ‘cursor’, which is by default a little round grey dot, configurable to have different sizes or a coloured outline. Selection is achieved by default through a timed dwell – set at 0.75s on Joe’s, which is fairly quick, but it felt comfortable with no errors or mis-hits. It helps that the iPad is an incredibly familiar device, so not much visual hunting for items is required. I don’t think there is a pause eye gaze option for normal iPad operation, other than turning Assistive Touch off.

Typically, eye gaze is a suitable option for users with MND or spinal injuries, as they tend to be very still. Most of our clients have Cerebral Palsy, which can present as uncontrolled or uncoordinated dystonic movements that can affect eye gaze accuracy. However, a quick test of moving my head position did not seem to affect the Pilot’s ability to capture my eye movements. More experience is needed in the field with clients (I’ll report back), but it is certainly encouraging. I don’t know if the iPad camera is also used to detect head location (?), but the Pilot seemed to do a good job of allowing me to reposition myself whilst using the device.

Apple’s Accessibility

I’ll discuss Apple’s accessibility before looking at the TD-only options, as much of what makes this device special is baked into the operating system itself. The Assistive Touch options here are incredibly well thought out – as are all of the iPad’s accessibility features. Apple’s accessibility considerations really do seem 20 years ahead of the other major computing manufacturers – head tracking on the MacBook and native switch control over the whole device baked into macOS/iOS without any software required, to name just two examples.

The iPad can be woken up by the Pilot (wake on gaze) and then securely unlocked using iPad Pro face recognition (via the iPad camera). Face ID can also be configured to allow App Store purchases (without the two taps on the power button); however, an additional menu item needs to be set up in the Assistive Touch options for Apple Pay to work.

Assistive Touch presents as a transparent grey box permanently on the screen (some users may recognise this as a feature they implemented when their old iPhone home button broke!). This icon can be moved wherever you require on the screen. Gazing at it brings up a configurable menu of 8 options; typically, ‘home’ is included along with gestures and programmable shortcuts. One new option is the fall-back option, where a user can choose whether the device falls back to the default operation of a tap after a ‘special’ operation (gesture or recipe). This is useful for games that require a consistent swipe recipe, for example, so that the user can keep swiping with every dwell.

Here it becomes immediately apparent, especially as the menu is called Assistive Touch, that everything that follows is there to emulate touch access. The iPad is, of course, predominantly designed for this method of access, and every app is typically best used with touch – although keyboard and mouse access is now an option. This touch emulation requires that the user understands what the touch process looks like. Using a two-finger gesture to zoom, for example, is potentially meaningless to someone who has never used their hands to operate a device; therefore, I would suggest that use of the full iPad functions requires a good cognitive understanding of what is being emulated and why (the same consideration is necessary for full switch control) – although, for access to most functions, gestures are not necessary.

Within the Assistive Touch options on iOS 15, the invisible dwell ‘window’ size is configurable, to accommodate varying amounts of dwell ‘wobble’. Also specifically for eye gaze users are hot corners – users can configure the top-left corner to be the home ‘button’, for example. Other device options, like volume, can be assigned too, and the eye gaze ‘snaps’ to these areas. Note that this does affect the use of some apps which may have small options in the corners – I didn’t spot a hot corner on/off toggle within the Assistive Touch menu, but it wouldn’t surprise me if Apple has included one somewhere.
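To make the dwell mechanics concrete, here is a minimal sketch of how timed-dwell selection with a ‘wobble’ window might work. All names here are hypothetical – neither Apple nor Tobii Dynavox publish their implementation – but the idea is simple: if the gaze stays within a tolerance radius for the dwell time, trigger a tap at that point.

```swift
import Foundation
import CoreGraphics

/// Hypothetical sketch of timed-dwell selection, assuming a stream of
/// gaze samples. If the gaze stays inside the 'wobble' window for the
/// full dwell time, a tap fires at the anchor point.
struct DwellDetector {
    let dwellTime: TimeInterval = 0.75   // e.g. the 0.75s used on Joe's device
    let windowRadius: CGFloat = 40       // the configurable 'wobble' tolerance
    private var anchor: CGPoint?
    private var anchorTime: TimeInterval = 0

    /// Feed each gaze sample; returns the tap point when a dwell completes.
    mutating func process(gaze: CGPoint, at time: TimeInterval) -> CGPoint? {
        if let a = anchor, hypot(gaze.x - a.x, gaze.y - a.y) <= windowRadius {
            if time - anchorTime >= dwellTime {
                anchor = nil             // reset so we don't re-trigger
                return a                 // dwell complete: tap here
            }
        } else {
            anchor = gaze                // gaze moved too far: restart the timer
            anchorTime = time
        }
        return nil
    }
}
```

The two numbers a user tunes – dwell time and window size – correspond directly to the settings described above.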

With little effort, the entire iPad is now fully accessible, with no hacky workarounds or additional software requirements (post-calibration), which, when I sit back and consider, is truly amazing. Suddenly this feels like I’m again using an AT ‘game-changer’ product – an accolade the iPad already achieved for many AAC users.

Tobii Apps

Tobii Dynavox has also released TD Talk, a lovely, simple, predictive text-only AAC app (with the glaring omission of a copy-text button… please add this!). However, what makes this special is that once launched, the Assistive Touch menu disappears, and the user no longer has to emulate touch access – this is designed for eye gaze users. The learned prediction and pre-stored phrases are stored in a user’s iCloud account and will transition to other devices for future upgrades or repairs.

This uses the same calibration profile (as I presume it’s stored on the camera itself) but has its own dwell settings to set up. Presumably, like the Hiru + Predictable iPad eye gaze option, the app communicates directly with the camera, without requiring Apple’s iOS options. The keyboard glows nicely once it is dwelled upon, and the Pilot displays a ‘loading’ dot sequence on the rear screen whilst the user is writing – a wonderful feature. The rear screen can also display the text as it is spoken or while it is being composed. It’s a complete mystery to me how Tobii Dynavox has got this rear screen to work on an iPad, but it’s great.

A quick gaze off-screen brings up a lower menu to control TD Talk settings and to turn Assistive Touch back on, to exit the app and do other iPad functions. Currently, there is a long-winded workaround to copy text from TD Talk to other apps using Assistive Touch’s menu items – so it is possible.

TD Snap has also been updated to work directly with the camera, with non-emulated cursor access, and has a programmable ‘home’ option as one of the cells. Again, dwell time needs to be set within the app itself. The TD Snap app warrants plenty of discussion in itself – especially the built-in Google Assistant that allows for smart home control without a Google Home Hub – but that’s for another time…

Augmentation

At access: technology, we often augment a client’s device with a separate wireless keyboard and trackpad, as this enables TAs to support access to written work without leaning over a user or moving their device – we must have purchased over 50 Logitech K400s for clients. For eye gaze, these cheap little keyboards are perfect for supporting a calibration step-through without bringing hands over the device within the user’s field of vision. The K400 will work (using a USB adaptor) with an iPad, but the Pilot has no additional access to the USB port. The Bluetooth equivalent is the K830 (annoyingly out of stock in the UK, but I have a couple in the stock cupboard), so I was curious to see if a TA or carer could support a user navigating the Pilot or entering text without needing to lean over them. Once the K830 was paired, a second, smaller cursor dot appears and functions as expected – the eye gaze cursor and the Bluetooth cursor do continuously ‘fight’ with each other visually, but it is possible to effectively support someone using this device – especially with typing.


Plugging in a switch just worked. I was able to gaze and select with the switch – an incredibly fast access method – even whilst dwell was still active and with the K830 also connected.

Buddy Button Switch

However, we could not get standard switch access to work using the switch input. This is possibly unnecessary, as you can use a Bluetooth switch input on a base iPad for much less cost if switch control is your preferred method of access, but it might have been nice to have as an option. There is the possibility that using the K830 confused the iPad, as the switch click would only perform a cursor click in its last position, so we donated a switch to Joe to investigate in case our fettling had inadvertently broken it!

Apple Shortcuts work as expected; Lucy created a flashing-light ‘I know the answer’ indicator for the classroom (the equivalent of raising a hand), which alerts a teacher using the iPad camera flash. It can be programmed directly into the Assistive Touch top-level menu for quick access (two hits). She set this up on Joe’s Pilot, so you get to see a demo of this in action.
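For the curious, the flash itself is just the torch on the rear camera. Lucy built the indicator entirely in the Shortcuts app with no code, but a rough Swift sketch of equivalent torch control – a hypothetical helper using AVFoundation, not anything from TD or the Shortcut itself – looks something like this:

```swift
import AVFoundation

/// Hypothetical helper: flash the rear camera torch a few times,
/// roughly what the "I know the answer" shortcut achieves.
/// (The real indicator was built in the Shortcuts app, with no code.)
func flashAttentionSignal(times: Int = 3, interval: TimeInterval = 0.4) {
    // The rear wide-angle camera carries the torch on an iPad Pro.
    guard let device = AVCaptureDevice.default(for: .video), device.hasTorch else {
        print("No torch available on this device")
        return
    }
    for _ in 0..<times {
        do {
            try device.lockForConfiguration()
            device.torchMode = .on
            device.unlockForConfiguration()
            Thread.sleep(forTimeInterval: interval)

            try device.lockForConfiguration()
            device.torchMode = .off
            device.unlockForConfiguration()
            Thread.sleep(forTimeInterval: interval)
        } catch {
            print("Could not lock torch configuration: \(error)")
        }
    }
}
```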

Final thoughts

An hour after Joe left, I had placed an order – partly due to the lack of a loan system, but mostly because I felt confident in the product and that it will be suitable for a number of our clients. Typically, with anything new, I would wait until version 2 – so everyone else can figure out any potential issues. However, this feels like a very well designed and considered device, not a ‘beta’ product. I imagine that many developers and staff at Tobii have been eager to share this for many months – and kudos to them for waiting to release all the functions of iOS eye gaze, proprietary eye gaze within TD apps, and the rear screen as one coherent product.

With only one user calibration possible, the TD Pilot is very well suited for a single user; I think it will be hard to use as a shared device within a school. For a single competent eye gaze user, who is capable of grasping that the experience is, for the most part, emulating standard iPad touch access, the TD Pilot feels like a fantastic new option for them to fully embrace independent access. I can’t wait to try one with our clients!