The lockdown during the COVID-19 pandemic revealed our dependence on technology, and many brands are now rethinking how to better connect with people who shop online.
Consumers in the Middle East are increasingly going online to buy food, see each other and work out. Clothing sales, however, have fallen sharply since the beginning of 2020. Some companies believe this is because certain online interactions need a more human touch.
That is an idea Dubai-based software provider Getbee has capitalized on, creating what it calls the “first custom platform” for brands to better connect with customers, who can be advised by an expert they trust.
The company’s CEO, a former teacher turned entrepreneur, says that while global fashion profits declined in 2020, online retail sales rose by an average of 28% within months for the nearly 20 brands that use the platform.
“Buying online can be lonely. Typically, people make purchases on their own. 86% of customers prefer to deal with a person rather than a chatbot. That’s why optional personalized shopping works,” says Thea Myhrvold, CEO of Getbee.
Chalhoub Middle East, which represents luxury retail clients such as Lancôme and Faces, says humanizing online platforms is the way to go, especially after the pandemic.
“It helped us survive, and it helped us generate results that we would not have achieved if we had waited for our IT specialists to build e-commerce platforms for us,” says Aleksandra Harciarek, director of digital projects for the Chalhoub Group.
“Given the need to create a more ‘human’ technology that earns our trust, developers believe it is necessary to build human values into more complex technologies such as artificial intelligence and machine learning, to make them trustworthy,” says Euronews journalist Salim Essaid.
This raises the question: can technology lie? The answer is yes. It all depends on the data used to guide machine learning. In 2015, Amazon’s artificial intelligence recruiting tool rated resumes containing the word “woman” less highly. An MIT study found in 2018 that automated facial recognition algorithms had higher error rates for people of color than for white people. Opposition to similar issues in 2020 led Microsoft, IBM and Amazon to refuse to sell their facial analysis software to US police, for fear of racial profiling and mass surveillance.
Despite this, only 25% of companies consider the ethical implications before investing in AI, according to PricewaterhouseCoopers. The firm estimates that the technology will contribute some 13 trillion euros to the global economy by 2030. Start-up DatumCon trains AI neural networks to ensure oil drilling safety, track mask use and social distancing, and detect human emotions with “sentiment analysis”. The company’s CEO says AI outperforms humans at specific tasks but needs human guidance for more complex ones, such as emotion detection, which can carry biases if not properly steered.
How do you train machine learning to be more humane, or more ethical? Salim Essaid wants to know.
“Every time we are going to implement a network … let’s say, here in Dubai, we make sure that the data labeling comes from local citizens, because we want to be able to reflect the local characteristics of the data,” explains DatumCon CEO César Andrés López.
Under the supervision of a psychologist, DatumCon uses 30,000 different faces to teach its system to detect each basic emotion, all in an effort to create a more “human” technology.