Scots researchers develop smartphone software that uses AI to detect different surfaces

Smartphones could soon be able to carry out a range of tasks just by recognising what surface they are sitting on, thanks to pioneering researchers at the University of St Andrews.


The software could prevent awkward phone calls. Picture: Getty Images/iStockphoto

The team’s system, known as SpeCam, needs no additional hardware: when the phone is placed face down on a surface, its screen flashes a series of colours. The light bounces back to the handset’s front-facing camera, where the image data is analysed to detect the type of material.
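In rough outline, that flash-and-capture loop can be pictured in a few lines of code. The snippet below is an illustrative mock-up, not the St Andrews team’s implementation: it fakes the camera capture step with simulated data, builds a “colour signature” from the light reflected under each screen colour, and stands in an off-the-shelf nearest-neighbour classifier (scikit-learn) for whatever model the researchers actually trained.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

FLASH_COLOURS = ["red", "green", "blue", "white"]  # colours flashed on the screen

def capture_mean_rgb(surface_profile: np.ndarray, colour_index: int) -> np.ndarray:
    """Stand-in for grabbing a camera frame and averaging its RGB values.

    On a real handset this would read the front camera while the display shows
    FLASH_COLOURS[colour_index]; here it is simulated with noise around a
    made-up per-surface reflectance profile.
    """
    rng = np.random.default_rng()
    return surface_profile[colour_index] + rng.normal(0.0, 0.02, size=3)

def colour_signature(surface_profile: np.ndarray) -> np.ndarray:
    """Concatenate the mean reflected RGB for every flashed colour."""
    return np.concatenate(
        [capture_mean_rgb(surface_profile, i) for i in range(len(FLASH_COLOURS))]
    )

# Hypothetical reflectance profiles (one RGB triple per flashed colour),
# used only to fabricate training data for this sketch.
profiles = {
    "coffee table": np.array([[0.6, 0.3, 0.2], [0.3, 0.5, 0.2],
                              [0.2, 0.3, 0.5], [0.7, 0.6, 0.5]]),
    "sofa":         np.array([[0.4, 0.1, 0.1], [0.1, 0.3, 0.1],
                              [0.1, 0.1, 0.4], [0.4, 0.3, 0.3]]),
}

# Build a small training set of noisy signatures and fit a k-NN classifier.
X = [colour_signature(p) for p in profiles.values() for _ in range(20)]
y = [label for label in profiles for _ in range(20)]
model = KNeighborsClassifier(n_neighbors=3).fit(X, y)

print(model.predict([colour_signature(profiles["sofa"])]))  # -> ['sofa']
```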


Depending on the surface, the phone could be programmed to perform different functions. For example, if placed on a coffee table, it could play dance music, changing to jazz when moved to a sofa.
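The customisation side amounts to a simple lookup from recognised surface to action. The toy example below uses invented surface names and actions purely to illustrate the idea; it is not taken from SpeCam itself.

```python
# Toy mapping from a recognised surface to a user-configured action.
SURFACE_ACTIONS = {
    "coffee table": lambda: print("Playing dance music"),
    "sofa": lambda: print("Switching to jazz"),
    "dinner table": lambda: print("Auto-reply: busy right now"),
}

def on_surface_detected(surface: str) -> None:
    """Run the configured action for the detected surface, if any."""
    action = SURFACE_ACTIONS.get(surface)
    if action is not None:
        action()

on_surface_detected("sofa")  # -> Switching to jazz
```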


Lead researcher Professor Aaron Quigley, chair of human-computer interaction at the University of St Andrews’ school of computer science, told The Scotsman: “The surface of the world around you becomes a canvas for your imagination.

“For example, you can set it to change the music and light settings when placed on a bedside table, or play a cooking programme when it’s moved to the kitchen table. You can customise it very simply and it’s very discreet, so if a phone call comes in while you’re having dinner, it can send a message back saying you’re busy. There’s a need for a lot more polite technology than we have right now.”

“We’re helping you be mindful so your phone doesn’t embarrass you – but if you don’t want it to help, you just place it face up.”

The system needs a period of training, using machine learning, to teach it to recognise different surfaces. If the phone is placed on an unfamiliar material, such as a kitchen worktop or bedside table, it will ask the user where it is to help it become “location aware”.
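That “ask when unsure” behaviour can be thought of as a confidence check wrapped around the classifier. The sketch below is a guess at how such a step might look, using a hypothetical nearest-centroid matcher and an arbitrary threshold rather than anything described by the researchers.

```python
from __future__ import annotations

import numpy as np

# label -> colour signatures previously recorded on that surface
known_surfaces: dict[str, list[np.ndarray]] = {}
CONFIDENCE_THRESHOLD = 0.9  # arbitrary cut-off for "familiar enough"

def classify(signature: np.ndarray) -> tuple[str | None, float]:
    """Nearest-centroid match; confidence shrinks as the distance grows."""
    best_label, best_conf = None, 0.0
    for label, samples in known_surfaces.items():
        centroid = np.mean(samples, axis=0)
        conf = 1.0 / (1.0 + float(np.linalg.norm(signature - centroid)))
        if conf > best_conf:
            best_label, best_conf = label, conf
    return best_label, best_conf

def identify_surface(signature: np.ndarray) -> str:
    """Return a surface label, asking the user when the match is too weak."""
    label, confidence = classify(signature)
    if label is None or confidence < CONFIDENCE_THRESHOLD:
        # Unfamiliar material: prompt the user, e.g. "kitchen worktop".
        label = input("Where is the phone resting? ")
    known_surfaces.setdefault(label, []).append(signature)
    return label
```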

Quigley added: “I’m most proud of the fact that we worked with a colleague who has a background in physics and we did a benchmarking against a very expensive spectrometer – we performed better than this device that costs thousands of pounds.

“That’s because a spectrometer uses a tiny point of light but the phone uses the whole screen so can take in more surface data. It’s like looking through a keyhole compared with looking through a window. SpeCam can tell the difference between a ceramic plate and a porcelain one.”

He said the software – which has been showcased at an international conference on human-computer interaction in Vienna – could eventually become part of a smartphone’s operating system, but in the meantime he is keen for developers to take the idea and run with it.


“It’s an idea that we want to give out and let lots of people think about,” Quigley said.

“If developers look at this and can see a clear path to applications in health, or even music players, you could have it on phones by Christmas, no problem. Getting it into the actual operating systems is more of a two-year process.”

Hui Yeo, whose PhD research at St Andrews led to SpeCam, added: “The goal is to explore novel forms of interaction for one-handed use which can leverage what people already do with devices, without adding new hardware.”