Swift Creatives get… creative with how we could augment our homes
Augmented reality in the smart home, outside the bounds of smartphone screens and smartglasses, could make our interactions with technology more social, more playful and more useful. Add depth sensors, projectors and voice interaction into the mix and suddenly every object in our homes could be enhanced with virtual features.
That’s what Swift Creatives, a design and innovation studio, has been up to with its recent batch of AR concepts. Matthew Cockerill, creative director and head of studio for Swift Creatives London, told us that the project came out of work they did for HTC last year and the limitations the team saw in Apple’s ARKit and Google’s ARCore. The concepts below, he says, are “one to two years away”.
“We fill our houses with furniture that’s 100 years old and high tech TVs,” he says. “People want a rich, warm environment that allows them to have rich experiences with the people in their homes, that’s why AR is exciting for us.”
Cockerill and his team built prototypes for each of the four augmented reality concepts, but he notes that in the videos, some of the actions are based on real data from sensors and some are simulated, to save time and money. The next stage that Swift Creatives is moving onto this year is figuring out what can be achieved with existing platforms including ARKit 2.0, ARCore and Microsoft HoloLens.
We asked Cockerill to talk us through each of the AR smart home ideas.
The connected worktop
“There have been lots of these concepts where they will project the steps of how to make the bread onto the table,” says Cockerill. “And in a way, consumers don’t need that because you can get that valuable experience with an iPad that you’ve already got. So you don’t need to buy a new system. But actually what augmented reality means is that the system intimately knows the 3D environment.
“What we’ve done with these prototypes is look at how the system intimately understands the topology. So when you’re pouring flour, it’s actually understanding the profile of that flour and so it can calculate the volume of how much is there.
“So it’s actually allowing you to free pour flour onto a table until the system tells you that there’s enough flour for your recipe. That gives you something that a smartphone can’t do.
“I’d imagine connecting it to Alexa. We didn’t do that with the prototypes but you could say, ‘Alexa, I want to make some bread.’ Alexa might say, ‘OK let’s measure out the flour.’ You look at our visuals, a dot appears on the table and it says, ‘Start pouring now’.”
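The free-pour measurement Cockerill describes can be sketched as a depth-map calculation: subtract a baseline scan of the empty worktop from the live scan and integrate the height differences over the pixel area. This is an illustrative sketch, not Swift Creatives' actual implementation; the sensor units and pixel size are assumed values.

```python
import numpy as np

def poured_volume_cm3(baseline_depth, current_depth, pixel_area_cm2=0.01):
    """Estimate the volume of material poured onto a worktop.

    baseline_depth: depth map (cm) of the empty worktop, from an overhead sensor
    current_depth:  depth map (cm) of the worktop with the flour pile on it
    pixel_area_cm2: real-world area covered by one depth pixel (assumed value)
    """
    # Poured material raises the surface, so its depth reading is smaller
    # than the baseline; clamp sensor noise below zero.
    height = np.clip(baseline_depth - current_depth, 0, None)
    return float(height.sum() * pixel_area_cm2)

# Toy example: a flat 100x100-pixel worktop 50 cm from the sensor,
# with a 20x20-pixel pile 2 cm high in the middle.
baseline = np.full((100, 100), 50.0)
current = baseline.copy()
current[40:60, 40:60] -= 2.0  # the pile raises the surface by 2 cm

volume = poured_volume_cm3(baseline, current)
# 400 pixels x 2 cm x 0.01 cm^2 per pixel = 8 cm^3
```

Converting volume to weight would then only need an assumed bulk density for flour (roughly 0.5–0.6 g/cm³), which is how the system could announce when there's enough flour for the recipe.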
The flight tracking globe
“If I have a globe at home, I could ask Alexa ‘Alexa, where’s Jane’s flight right now?’” says Cockerill. “Then an AR digital picture of an aeroplane flies around the globe and starts hovering where the plane actually is in relation to the world. It’s totally doable. The great thing is, that digital information is already there in APIs for flights. This is about having digital information more connected to our physical objects.
“If you rotate the globe, it shows you the estimated time of arrival, because spinning a globe around is obviously like fast-forwarding things. If you rotate it around, you can see a dotted line to where it’s going to land and how many hours it is from that destination. It’s not only about showing relevant data on an object, an aircraft on a globe, but the fact that you can interact with that physical object and the data can change. There are really interesting areas to explore around manipulating physical objects and having digital data update.
“Then there’s the idea of metadata. We think in a way this globe represents the idea that physical objects have this digital metadata associated with them. So the sell-by date on food in the fridge, an eBay bid on objects I’m selling, or my globe, all have digital data connected to them. And when you start looking at the world like that, you could think: when I look at my car, what digital overlay could I do on its physical elements? As designers it opens up a whole world of possibilities for us, which we’re quite excited about.”
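As a sketch of the mapping involved, here's how a flight's latitude and longitude (which any flight-tracking API provides) could be converted into a 3D point hovering just above the globe's surface, in the globe's own coordinate frame. The globe radius and hover distance are illustrative assumptions; a real system would also rotate this point by the globe's tracked orientation so the overlay stays locked in place as you spin it.

```python
import math

def latlon_to_globe_xyz(lat_deg, lon_deg, globe_radius_cm=15.0, hover_cm=2.0):
    """Map a flight's latitude/longitude onto a point just above a
    physical globe's surface (standard spherical-to-Cartesian conversion).
    globe_radius_cm and hover_cm are illustrative values.
    """
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    r = globe_radius_cm + hover_cm  # hover the aeroplane just off the surface
    x = r * math.cos(lat) * math.cos(lon)
    y = r * math.cos(lat) * math.sin(lon)
    z = r * math.sin(lat)
    return (x, y, z)

# A flight mid-Atlantic (45 N, 30 W) rendered over a 15 cm globe:
x, y, z = latlon_to_globe_xyz(45.0, -30.0)
```

Spinning the globe forward in time, as in the ETA interaction, would simply mean interpolating the flight's position along its route before applying the same mapping.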
The snowfall that sees your ornaments
“What we’re trying to do is use technology to assist in your daily life – from that hygge, very Scandinavian perspective, which is the origins of our consultancy,” says Cockerill.
“What we’re trying to do is create a more delightful and useful life in the home. We don’t want to be focused on tech. The snowfall, for instance, is a bit of a cliché I guess, but you could consider it almost a lava lamp: low-level, interesting lighting in your home. But it recognises things like pictures and objects, so it can exclude those from the snowfall to give you more of a sense that it’s integrated with that environment.
“Right now I could create some snowfall and it would project onto a room but it would just project onto everything and you’d lose that illusion. What our system can do is take the picture and understand that there’s a horizontal edge and start building up a snow pile onto it.”
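The "horizontal edge" detection Cockerill mentions can be sketched in miniature: given a 2D occupancy mask of the projection surface (a stand-in for the real depth-based scene understanding), the landing spots for snow are the solid cells with empty space directly above them.

```python
import numpy as np

def snow_landing_cells(occupancy):
    """Find 'horizontal edges' in a 2D scene mask: occupied cells with
    empty space directly above, where projected snow could settle.

    occupancy: 2D bool array, row 0 at the top of the projection surface.
    Returns a bool mask of landing cells.
    """
    landing = np.zeros_like(occupancy)
    # A cell is a landing spot if it is solid and the cell above is empty.
    landing[1:] = occupancy[1:] & ~occupancy[:-1]
    # A solid cell in the top row has nothing above it, so it also counts.
    landing[0] = occupancy[0]
    return landing

# Toy scene: a picture frame occupying rows 5-8, columns 2-7.
scene = np.zeros((12, 10), dtype=bool)
scene[5:9, 2:8] = True

top_edges = snow_landing_cells(scene)
# Snow settles along the frame's top edge: row 5, columns 2-7.
```

A full system would accumulate snow height at those cells frame by frame to build up the pile, while excluding the recognised picture itself from the falling flakes.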
The ambient baby mobile
“The way we conceived these was to show how these experiences could be valuable through the day. So from the morning in the baby’s room, through to cooking, through to socialising and just relaxing in the evening.
“What you see in the videos is one to two years out because of two issues,” he explains. “There’s the latency. If I have an object then I move it and Apple ARKit at the moment can’t update quickly enough to provide that experience.
“And the other thing is about trying to track lots of things so the mobile, for instance, where it’s all moving and we map the physical mobile to those shadows. There’s literally too many of those elements moving at once to do that. It’s about tracking and mapping and it’s about latency.”