Olivia Brookhouse

Olivia completed her Erasmus internship in 2019 with the Marketing & Communication team at Telefónica Tech.
AI & Data
Can Artificial Intelligence understand emotions?
When John McCarthy and Marvin Minsky founded the field of Artificial Intelligence in 1956, they were amazed at how a machine could solve incredibly difficult puzzles quicker than humans. However, it turns out that teaching Artificial Intelligence to win a chess match is actually quite easy. What presents real challenges is teaching a machine what emotions are and how to replicate them.

“We have now accepted after 60 years of AI that the things we originally thought were easy are actually very hard, and what we thought was hard, like playing chess, is very easy.” Alan Winfield, Professor of Robotics at UWE, Bristol

Social and emotional intelligence come almost automatically to humans; we react on instinct. Whilst some of us are more perceptive than others, we can easily interpret the emotions and feelings of those around us. This base level of intelligence, which we were partly born with and have partly learnt, tells us how to behave in certain scenarios. So, can this automatic understanding be taught to a machine?

Emotion Artificial Intelligence (Emotion AI)

Although the name may throw you off, Emotion AI does not refer to a weeping computer that has had a bad week. Emotion AI, also known as Affective Computing, dates back to 1995 and refers to the branch of Artificial Intelligence which aims to process, understand and even replicate human emotions.

Photo: Lidya Nada / Unsplash

The technology aims to improve natural communication between man and machine to create an AI that communicates in a more authentic way. If AI can gain emotional intelligence, maybe it can also replicate those emotions.

“How can [a machine] effectively communicate information if it doesn’t know your emotional state, if it doesn’t know how you’re feeling, it doesn’t know how you’re going to respond to specific content?” Javier Hernández, research scientist with the Affective Computing Group at the MIT Media Lab

In 2009, Rana el Kaliouby and Rosalind Picard founded Affectiva, an emotion AI company based in Boston which specializes in automotive AI and advertising research. With customers’ consent, the user's camera captures their reactions while watching an advertisement. Using “multimodal emotion AI”, which analyses facial expression, speech and body language, they can gain a complete insight into the individual’s mood. Their 90% accuracy levels are thanks to a diverse test set of 6 million faces from 87 different countries used to train deep learning algorithms. From a diverse data set, the AI learns which metrics of body language and speech patterns coincide with different emotions and thoughts. As with humans, machines can produce more accurate insights into our emotions from video and speech than from text alone.

Sentiment analysis or opinion mining

Sentiment analysis, or opinion mining, a subfield of Natural Language Processing, is the process of algorithmically identifying and categorizing opinions expressed in text to determine the user’s attitude towards the subject. This use case can be applied in many sectors, such as think tanks, call centres, telemedicine, sales and advertising, to take communication to the next level. Whilst AI might be able to categorize what we say into positive or negative boxes, does it truly understand how we feel or the subtext beneath? Even as humans we miss cultural references, sarcasm and nuance in language, which completely alter the meaning and therefore the emotions displayed. Sometimes it is the things we leave out and don't say which can also imply how we are feeling.
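To make the idea concrete, here is a minimal, hypothetical sketch of a text-only sentiment classifier, assuming the scikit-learn library and an invented four-example training set (real systems are trained on far larger labelled corpora):

```python
# Minimal sentiment-analysis sketch: classify short texts as positive or negative.
# Assumes scikit-learn is installed; the tiny training set below is invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "I love this product, it works beautifully",
    "Fantastic service, very helpful staff",
    "Terrible experience, I want a refund",
    "The delivery was late and the item was broken",
]
train_labels = ["positive", "positive", "negative", "negative"]

# Bag-of-words features weighted by TF-IDF, fed into a linear classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_texts, train_labels)

# The model maps new text to a coarse label; sarcasm ("great, it broke on day one")
# and cultural nuance are easily misread.
print(model.predict(["great, it broke on day one"]))
```

A classifier of this kind only assigns coarse positive or negative labels based on word patterns, which is exactly where sarcasm, irony and the things left unsaid slip through.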
Such subtext is something AI is not yet sophisticated enough to understand, and many doubt it ever will be.

Can AI show emotion?

In many of these use cases, such as telemedicine chatbots and call centre virtual assistants, companies are investigating the development of Emotion AI not only to understand customers' emotions but also to improve how these platforms respond to each individual.

Photo: Domingo Alvarez / Unsplash

Being able to simulate human-like emotions gives these platforms and services more authenticity. But is this a true display of emotion? AI and neuroscience researchers agree that current forms of AI cannot have their own emotions, but they can mimic emotion, such as empathy. Synthetic speech also helps reduce the robotic tone many of these services operate with and conveys more realistic emotion. Tacotron 2 by Google is transforming the field by simulating human-like artificial voices.

So, if machines, in many cases, can understand how we feel and produce a helpful, even ‘caring’ response, are they emotionally intelligent? There is much debate within this field about whether a simulation of emotion demonstrates true understanding or remains artificial. Functionalism argues that if we simulate emotional intelligence then, by definition, AI is emotionally intelligent. But experts question whether the machine truly “understands” the message it is delivering, and therefore argue that a simulation is not evidence that the machine is actually emotionally intelligent.

Artificial General Intelligence

Developing an Artificial General Intelligence, which possesses a deeper level of understanding, is how many experts believe machines may one day experience emotions as we do. Artificial General Intelligence (AGI), as opposed to Narrow Intelligence, refers to the ability of computers to carry out many different activities, like humans. Artificial Narrow Intelligence, as the name suggests, aims to complete individual tasks, but with a high degree of efficiency and accuracy.

Photo: TengyArt / Unsplash

When we talk about emotional and social intelligence, forms of intelligence which are not necessarily related to a set task or goal, these fall under Artificial General Intelligence. AGI aims to replicate the qualities which, to us, seem automatic. They are not tied to an end goal; we do them just because we do.

Conclusions

We are still many years away from having an Artificial General Intelligence capable of replicating every action we can perform, especially those qualities which we consider most human, such as emotions. Emotions are inherently difficult to read, and there is often a disconnect between what people say they feel and what they actually feel. A machine may never get to this level of understanding, but who is to say that the way we process emotions is the only way? How we interpret each other’s emotions is full of bias and opinion, so maybe AI can help us get straight to the point when it comes to our emotions.

Featured photo: rawpixel.com on Freepik
May 23, 2023
AI & Data
What to expect from Artificial Intelligence in 2020?
At the end of 2019 we launched our new Twitter campaign, #LUCAtothefuture, to explore various Artificial Intelligence technologies and see how they will advance this year and beyond. It is not hard to predict that 2020 will be a big year for AI, just as previous years have been. It was accepted as a necessity by many industries in 2019, but we are now starting to see its incorporation into every area of our lives. AI is no longer a disruptive technology, but instead the basis for success. In 2020 we will see AI become prevalent not only in the business world but also in our day-to-day lives, if it hasn't already. In our poll on Twitter, we asked you to rank the fields in which you would most like to see developments in Artificial Intelligence:

Telemedicine

Telemedicine refers to the practice of caring for patients remotely using various connected services. The term is also used to refer to the use of advanced AI and Machine Learning software to diagnose diseases. Telemedicine is vital to provide quick diagnosis and treatment when face-to-face communication is not available. Just a few days ago, Google's DeepMind AI software was able to identify breast cancer more accurately than radiologists, proving that doctors must start incorporating these Artificial Intelligence technologies into their practices. Companies like Gyant are developing an AI medical chatbot which, with the use of patient records, can provide appropriate care advice with very good results (a patient satisfaction rate of 4.9 stars). Read our post about Telemedicine in more detail.

What should we expect in 2020?

Telemedicine reaching more vulnerable communities, replacing in-person visits
Wider use of video conferencing in developed countries
Improvement of AI medical advice chatbots
Introduction of virtual chronic disease management with the development of hospital-at-home devices

Delivery Drones

Whilst military spending will probably remain the main contributor to drone spending, many diverse industries are beginning to realise the usefulness of IoT drones, from stock management to photography to pizza delivery. Delivery drones, also referred to as UAVs (Unmanned Aerial Vehicles), are disrupting the logistics market and replacing traditional transport methods. The technology can also be used to transport medicine and supplies to dangerous areas. Amazon announced at its AI conference in June 2019 that drone deliveries would increase the efficiency of its delivery service enormously and that it expects to roll out Amazon Air fully in 2020. We should expect other major companies to follow closely behind.

What should we expect in 2020?

Wide use of delivery drones by big companies to deliver packages to homes, particularly in the US
Improvement in drones' range to reach remote areas
Increased regulatory scrutiny

Autonomous vehicles

We've been expecting the release of autonomous cars for many years now, but how close are we to them becoming a reality? The technology is thus far restricted to specific manoeuvres such as steering, acceleration and deceleration, and any other automation of vehicles can only be conducted on closed campuses. The main obstacle for many companies is improving the accuracy of perception systems, allowing vehicles to understand their environment. This year Tesla acquired DeepScale, an AI start-up specialising in object identification, to help move from level 2 automation to level 3, where the vehicle is completely responsible for monitoring the driving environment.
Elon Musk commented that their cars are “able to drive from one’s house to work, most likely without interventions, but with supervision.” Read our post about autonomous vehicles in more detail.

What should we expect in 2020?

Increase in testing hours to develop perception machine learning systems
Development from level 2 automation (vehicle assistance) to levels 3 and 4 (self-driving vehicles)
Increased discussion of liability issues

Robotic assistants, physical and virtual

Although some sci-fi films of the last 20 years have presented a negative vision of robotics, in reality robotic assistance, both physical and virtual, can help us in many areas of our lives. Japan has announced that robots will play a major role in the 2020 Tokyo Olympics; some will be used to collect and distribute sporting equipment, while others will allow individuals to attend the event virtually. Over recent years we have started to see the incorporation of AI-powered smart speakers into our homes, all primarily voice activated. AI chatbots are also improving customer experiences in many industries, delivering quick and intelligent responses and reducing the workload for customer service handlers. Telefónica, whose Artificial Intelligence, Aura, is present in 8 countries with more than 1,300 use cases, offers personalized and immediate responses to its customers within their homes.

The area of robotics is growing at an exceptional rate, becoming more intelligent every day, but companies are mainly relying on Artificial Narrow Intelligence to perform specific tasks. Artificial General Intelligence remains in its infancy, and some experts believe we are still at least 50 years away from human-like robotics.

https://www.youtube.com/watch?v=Ijc1OjZaglo
At CES 2020 in Las Vegas, Samsung presented their new home robotic helper.

What should we expect in 2020?

Increased focus on Artificial General Intelligence
Development of physical home assistants by Big Tech to perform routine tasks
Workplace digital assistants to perform routine tasks
50% of searches performed with voice
Intelligent chatbots assisting customer service roles

The next decade promises exciting things for the development of AI, in all industries and in every part of our lives. Follow us on Twitter to make sure you don’t miss anything in our series of #LUCAtothefuture. Read about the ethical issues which accompany the development of many of these technologies.

To stay up to date with LUCA, visit our Webpage, subscribe to LUCA Data Speaks and follow us on Twitter, LinkedIn or YouTube.
January 10, 2020
AI & Data
The Big Data behind Black Friday
It is often the products that seem to have “come out of nowhere” that suddenly experience rocketing sales and have everyone talking. Whether it's health fads such as kombucha, trainers as heavy as bricks labelled the ‘chunky trainer’, or shapewear designed to help you resemble a Kardashian, we've seen it all. So how do companies know what's going to sell and how to sell it? With large supply chains, companies must be able to prove the future demand, and therefore the success, of a product before it goes to market, and this is where Big Data can play an increasingly big role in predicting consumer behaviour. This is particularly important for the biggest shopping day of the year: Black Friday.

What is Black Friday?

The first record of the phenomenon of Black Friday began in Philadelphia in the US in 1952, the day after Thanksgiving. It has since evolved into the biggest shopping day of the year worldwide. In the UK in 2018, 64% of the population bought something on Black Friday, equating to 42.5 million people, and in Spain 55.7% of the population participated, totalling 26 million people. The top-selling products on Black Friday are clothes, cosmetics, jewellery, shoes and electronics. With the incentive of getting a bargain, spending on Black Friday last year averaged £315 per person in the UK and €210 in Spain. However, the UK is providing better deals, with an average discount of 63% per product compared to 47% in Spain.

Whilst Black Friday began on local high streets, the use of brick-and-mortar stores has reduced significantly in recent years. On Black Friday in the UK in 2018, 64% of shoppers made purchases both offline and online, 24% just online and 12% just in shops. In Spain, 43.4% of shoppers made purchases both offline and online, 40.22% just online and 16.3% just in shops. Brands have had to react as consumer habits have changed. Because of the internet, the 21st-century consumer is less loyal to specific brands and led primarily by price. Due to the availability of information online, consumers are more equipped than ever before to compare products across hundreds of brands. Because of this, platforms such as Amazon and ASOS are able to dominate their respective markets, since they can showcase many cheap, comparable brands.

How does Big Data predict consumer trends?

Previous marketing strategies included monitoring social media or analysing surveys, without the ability to store, combine or optimize these data sources together. Big Data and AI allow companies to harness data from thousands of sources to predict future purchase habits with a high degree of accuracy. Algorithms are applied to find patterns within the mass of data and provide insights which both inform internal decision making and improve the customer experience. This means companies are able to harness data from years of Black Friday sales to accurately select products and discount rates which will attract the masses. Machine learning software is built into recommender systems on websites which, based on customers' previous purchases, can predict what customers will want to buy next, offering them more in-tune suggestions. Internal decision making is no longer based on inaccurate surveys of small focus groups but instead on human behaviour in real time.
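As an illustration of how such a recommender might work (a toy sketch, not any company's actual system), the snippet below applies item-based collaborative filtering to a small, invented purchase matrix and suggests the product most often bought alongside a previous purchase:

```python
# Toy item-based collaborative filtering: recommend products similar to past purchases.
# The purchase matrix below is invented for illustration; real systems use millions of rows.
import numpy as np

products = ["chunky trainers", "kombucha", "shapewear", "headphones"]

# Rows = customers, columns = products; 1 means the customer bought the product.
purchases = np.array([
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 1],
])

# Cosine similarity between product columns: products bought by the same people score high.
norms = np.linalg.norm(purchases, axis=0)
similarity = (purchases.T @ purchases) / np.outer(norms, norms)
np.fill_diagonal(similarity, 0)  # a product should not recommend itself

def recommend(bought_index: int) -> str:
    """Return the product most strongly co-purchased with the given one."""
    return products[int(np.argmax(similarity[bought_index]))]

print(recommend(products.index("chunky trainers")))  # suggests "shapewear" on this toy data
```

Production recommenders combine many more signals (browsing history, price sensitivity, seasonality), but the principle of learning from co-occurring behaviour is the same.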
“The old methods that were invented before the digital era are not agile, precise or predictive enough.” Tim Warner, PepsiCo

It is vital that large companies are able to predict the popularity, and therefore the profitability, of new products, because large supply chains restrict them from introducing new products quickly. For many, this means deciding Christmas product ranges 9 months before the holiday season has started. Start-ups like Black Swan are disrupting the way ordinary market research is done, analysing consumer purchase behaviour on a large scale to predict consumer trends for their clients. Unlike before, social media permits the mass spread of consumer trends promoted by influencers and celebrities alike. Some of the most common trends at the moment follow big social movements such as veganism and sustainability, and we can expect an increase in related products this Christmas.

Big Data for Advertising

Whilst Big Data is vital for the insights that shape product ranges, it can also inform advertising and sales strategy. At LUCA we help our clients, with the use of anonymised and aggregated data, to optimize their decisions and save costs. LUCA Business Insights solutions help locate prime locations for shops and advertisements to target the intended audience. LUCA can also provide insights within the retail location in order to optimize merchandise and displays to attract the desired client. This is particularly important at Christmas, when many stores compete to have the best window displays, so companies must showcase the correct products for each location. By identifying how people behave in and around retail points, brands can optimize their offline services.

https://www.youtube.com/watch?v=e5BMTDm3ffk

To stay up to date with LUCA, visit our Webpage, contact us and follow us on Twitter, LinkedIn or YouTube.
November 29, 2019
AI & Data
Artificial Intelligence, doing more than humanly possible since the Greeks
Many of those working in the world of Artificial Intelligence acknowledge the 1950s as the birth of concepts such as Machine Intelligence and Artificial Intelligence, with the creation of the ‘Turing Test’ by Alan Turing. In a previous blog post we talk about the importance of these early developments to the progress of machine learning technologies today. Whilst these years mark the start of AI as we know it, ideas about creating artificial life were imaginable long before the technology was even invented. Throughout history, new developments have been vilified for "going too far" and adding unnecessary complications to our lives. However, in many writings from ancient times we can see that humans have always dreamed of pushing the limits of nature, beyond what is humanly and biologically possible. Maybe it is within our nature to surpass it.

Ancient examples

Ideas of artificial life feature in many ancient texts from Ancient China, Hinduism and Ancient Greece. Dr Adrienne Mayor, a historian of science, has studied many examples of how ancient civilisations envisaged the concept of technology. The etymology of biotechnology traces back to the Ancient Greek word biotechne, meaning made by craft and not formed naturally. Even in a pre-industrial society, they imagined the idea of human-like technology: bending the rules of nature long before it was possible.

The Odyssey is a 12,000-line poem spanning years of Ancient Greek history, myth and legend, written in the 8th century BCE. The author was, according to tradition, a blind man named Homer, although nobody knows if he was real or legendary. Although the stories have altered and changed over time, many elements still stand out, envisaging how to imitate, augment and surpass nature. Some examples include ships guided by “thought alone”, presenting an ancient vision of driverless cars. Hephaestus, the god of the forge, makes automatic gates, self-moving carts, eagle-like drones and crews of golden servants who had reason, mind and strength beyond humans. They possessed all the divine knowledge of the gods, imitating a vast data system. You can find similar stories of autonomous vehicles and artificial life in Hindu myths. No single civilisation had a monopoly on ancient dreams of advanced technology, whether one looks at Greek, Etruscan, Egyptian, Hindu, Islamic, Chinese, or any other ancient culture.

“Myths about artificial life all contemplate what wonders are possible if only one could possess the divine creativity and abilities of the gods.” Adrienne Mayor, research scholar at Stanford University

Can bots write Greek plays?

Whilst the Greeks dreamed of replicating AI, AI is yet to perfectly replicate Ancient Greek culture. Spencer Klavan tweeted a two-page play his AI system created after being supplied with data from 1,000 hours of Greek tragedies. It produced an amusing result.

@spencerKlavan on Twitter

Whilst this bot was less successful, other companies are employing AI to write creative content. Using natural language generation, AI systems can replicate writing styles to produce poetry, social media posts or even financial reports. The technology was used to create an almost award-winning novel, The Day a Computer Writes a Novel. Bots are so scarily accurate at replicating content that it is often impossible to tell who, or what, has written something. This website allows you to guess whether a human or a bot has written a poem.
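The basic idea of replicating a writing style can be illustrated with a deliberately simple, hypothetical sketch: a word-level Markov chain that learns which words tend to follow which in a sample text and then generates new lines in a loosely similar style. Modern systems use large neural language models rather than this toy approach, and the sample text here is invented for illustration.

```python
# Toy natural-language-generation sketch: a word-level Markov chain.
# The sample text is invented; real style-replication systems use neural language models.
import random
from collections import defaultdict

sample = (
    "sing to me of the man muse the man of twists and turns "
    "driven time and again off course once he had plundered the hallowed heights of troy"
)

# Learn which words tend to follow each word in the sample.
words = sample.split()
transitions = defaultdict(list)
for current, following in zip(words, words[1:]):
    transitions[current].append(following)

def generate(start: str, length: int = 12, seed: int = 0) -> str:
    """Generate a short line by repeatedly sampling a plausible next word."""
    random.seed(seed)
    line = [start]
    for _ in range(length):
        options = transitions.get(line[-1])
        if not options:  # dead end: no known continuation of this word
            break
        line.append(random.choice(options))
    return " ".join(line)

print(generate("the"))
```

With only a few sentences of training text the output is mostly nonsense, which is roughly what the two-page "Greek tragedy" above showed; with vastly more data and more powerful models, the results become hard to distinguish from human writing.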
November 14, 2019