Symposium on Social Robotics Takeaways
Applications for social robotics are growing
We visited the United Arab Emirates to attend the Joint Social Robotics Symposium at United Arab Emirates University and New York University Abu Dhabi. We also visited several research organizations and related groups to gain a better understanding of research and investment trends in social robotics. During our visit we met with other researchers in robotics and artificial intelligence to learn about the latest research in the field. We also presented our view on the state of social robotics and participated in several panels covering a variety of issues related to robotics.
The changing face of social robotics
The symposium hosted some of the world’s leading robotics researchers, executives and thought leaders, including representatives from a range of organizations such as Fraunhofer, PAL Robotics, IEEE, the United States Military Academy, Oxford University, and Khalifa University, to name a few. Presentation topics ranged from robotics, artificial intelligence, digital twins, human-robot collaboration, commercial applications for social robotics, and ethics for autonomous robots in war, to sociology, psychology and other disciplines related to the development of robotics technologies. While the views on the evolution and direction of social robotics were as diverse as the attendees and their topics, there were key takeaways and some overlapping views on the future direction of social robotics.
Key takeaways from our trip to the UAE and the JSSR 2019
Overall, current social robotics research is heavily focused on expanding the use cases for human-robot interaction (HRI) technologies, robots, and related areas such as eye tracking, predictive learning, and human-machine relations. Most of the research presented was about designing robots and other autonomous machines that are better able to discern what humans need or want and preemptively take the requisite actions to better serve humanity.
The sharpened focus on research directed toward the development of commercial applications is another area born of lessons learned over the past two years, which witnessed the collapse of many social robot companies. In the same period, several ‘pure’ social robot projects were terminated or scaled back in favor of other robots, including Honda’s (OTCPK:HNDAF) Asimo and Softbank’s (OTCPK:SFTBY) Pepper. Mayfield Robotics and Jibo, as well as others such as the collaborative robotics pioneer Rethink Robotics, were among the more high-profile companies that went under in 2018. Nevertheless, these robots generated a copious amount of data which is now being utilized to develop more advanced but practical autonomous machines with social capabilities.
The changing face of social robots
Social robotics is evolving to encompass a wider range of applications
The past few years have witnessed many social robotics companies overpromising and underdelivering on their robots’ capabilities. As a result, the first wave of overfunded and underpowered social robot companies that launched products over the past few years has been closing down due to a variety of headwinds, including sluggish sales, technical hurdles, and intense competition from other technologies which outperform in terms of technological capability, price, or both. The emergence and rapid adoption of the smart speaker is just one example of a technology that did away with many of the first-wave social robots.
But despite the untimely demise of many first-wave social robot companies such as Jibo and Mayfield Robotics, other companies with longer-term strategic views on how social robotics is evolving are now entering the fray and appear to be accelerating their social robotics R&D activities. Among the larger, deeper-pocketed players in the space are LG Electronics (OTC:LGEAF), Honda, Toyota Motor (TM), Bosch, and Samsung Electronics (OTC:SSNLF). In the startup space, Intuition Robotics’ ElliQ, Blue Frog Robotics’ Buddy, and Furhat Robotics are the ones we are watching. And there are other, more niche social robots such as Sony’s (SNE) Aibo, a reboot of its legendary line of companion dog robots, and Anki’s Vector, which the company is marketing as a companion robot.
While many of the first-wave social robot companies languish, the innovators creating new markets are redefining the space. Funding alone will no longer carry these companies; their ability to create needed solutions will. Companies making second-wave social robots can learn from the central mistake of the now-defunct first-wave companies: failing to deliver social robots, or other systems with social robotics technologies, that offer useful applications, functions, or lasting entertainment value. When companies making social robots offer unique problem-solving features or greater utility than alternatives from lower-cost IoT devices (smart speakers, tablets, etc.), they will define and lead this nascent market, which is estimated to grow to more than USD500mn by 2022. The growth in consumer social robotics is due largely to burgeoning demand from the senior care market. However, this estimate is conservative, as it is based on the common view of a social robot as a machine that functions in a manner similar to the LG Hub, rather than on newer robots which have social functions but were previously not considered robots (e.g. autonomous personal transport vehicles).
Predictive coding to develop robotics EQ
Robots learning to interpret nuances of human behavior
Predictive coding for social cognitive development in robots is one of the areas on which social robotics researchers are focused. NICT/Bielefeld University is one of the leading organizations in this field. With predictive coding, robots, like humans, learn the behaviors of individuals and groups through observation so that they are better able to make human-like decisions when interacting with them. This technology will be utilized not only in social robots but in other autonomous machines in the Robotics Internet of Things (RIoT), which includes everything integrated with robotics technology, such as self-driving cars, which must operate in environments where they need to understand the states not only of the humans onboard but of those in the operating environment as well. In the automotive sector, we believe this technology would be useful in robots such as Hyundai Motors’ (OTCPK:HYMTF)(OTCPK:HYMLF) Elevate Ultimate Mobility Vehicle (UMV) or in Continental’s (OTCPK:CTTAY) last-mile logistics system, which utilizes autonomous delivery vans and a fleet of quadruped robots.
Predictive coding in robotic systems will make them safer and more efficient when operating in and around humans. Through a learned ability to empathize with people, autonomous systems that are charged with interacting freely with people will make better, more informed decisions. This is particularly important in areas such as self-driving cars: with this technology, a car will be capable of reading, for example, the face of a child standing next to a road to determine whether to stop or proceed on its current path. At present, humans can relatively easily discern the intentions of other humans, say when evaluating pedestrians at a crosswalk or determining whether a child standing on a sidewalk will run out into the road. With predictive coding, robots will, in theory, make decisions which until now were the exclusive purview of human beings and their ability to ‘sense’ and predict a person’s actions from facial gestures, eyes, body position and movement, etc. Beyond self-driving cars, predictive coding will also improve the productivity of other robots, as they will better serve humans by carrying out actions only when desired or needed, and only to the degree to which those actions are useful or wanted, putting an end to the annoying, repeated, and unwanted actions of many of today’s so-called ‘intelligent’ systems, which lack the capability to ‘understand’ our emotional states. Put another way, predictive coding empowers machines with greater emotional intelligence (EQ).
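To make the pedestrian example concrete, the following is a minimal illustrative sketch, not the NICT/Bielefeld implementation: a vehicle maintains a belief about a pedestrian’s intent (cross vs. stay) and updates it from observed behavioral cues using Bayes’ rule. The cue names and likelihood values are assumptions for illustration only.

```python
# Hypothetical cue likelihoods: P(cue | cross), P(cue | stay).
# These numbers are illustrative assumptions, not measured values.
CUE_LIKELIHOODS = {
    "gaze_toward_road": (0.8, 0.3),
    "leaning_forward": (0.7, 0.2),
    "standing_still": (0.3, 0.9),
}

def update_intent_belief(p_cross, cues):
    """Bayesian update of P(intent = cross) given observed behavioral cues."""
    for cue in cues:
        p_cue_cross, p_cue_stay = CUE_LIKELIHOODS[cue]
        numerator = p_cue_cross * p_cross
        p_cross = numerator / (numerator + p_cue_stay * (1 - p_cross))
    return p_cross

def plan_action(p_cross, threshold=0.5):
    """The vehicle slows down when crossing intent is judged likely."""
    return "slow_down" if p_cross >= threshold else "proceed"
```

Starting from a low prior, observing a gaze toward the road plus a forward lean pushes the belief above the threshold, and the planner switches from proceeding to slowing down; real predictive-coding systems learn such likelihoods from observation rather than hard-coding them.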
Robots and the dynamics of emotions in teams
Optimizing human-robot teams optimizes outcomes
Collaborative robots (cobots) are enjoying resounding success in deployments across a growing number of industries. This growth is driving researchers to optimize human-robot collaboration to ensure the greatest productive output from these teams. As cobots increasingly work alongside humans, there is a growing need to better understand the dynamics of human emotions in human-robot teams. Researchers at Cornell University CIS are exploring how robots impact group processes such as conflict, emotions, and power dynamics, and how robots influence these processes to alter group outcomes. This research is becoming more crucial as the number of autonomous robots operating in mixed human-robot group settings continues to grow at an accelerating rate.
In one study, the Cornell researchers developed a collaborative robo-microphone by integrating robotics technology into a microphone, for the purpose of understanding how humans interact with each other and with robots when collaborating in teams with robots. What they discovered was that human behavior changed considerably when a robot was part of a mixed human-robot team. The findings indicate that when robotics technology is integrated into an otherwise inanimate object (robotization), the interactions among humans change because of their interactions with the robot. The researchers believe these changes in human behavior will extend to other autonomous systems. The key takeaway is that companies must carefully design the social element when launching autonomous machines that interact with humans, including ‘social robots’, autonomous vehicles (AVs), and all other elements of the growing RIoT ecosystem, as they will influence the behaviors of the humans with whom they interact.
Rethink Robotics, one of the pioneers in the cobot space, shut down in late 2018 due to intense pressure from Teradyne’s Universal Robots as well as a growing flood of other deeper-pocketed competitors entering the market. In Korea, we believe the most promising companies in the cobot market are Doosan and Hanwha Techwin, with the former aggressively moving into the robotics space and the latter entering into strategic partnerships with companies such as Yujin Robot.
In robotics, there is a growing awareness of the need for greater standardization. Robotic interfaces must tackle more complex tasks while simultaneously becoming easier to use, in order to reduce the costs of specialized training and specialized handlers. The development of easier robotic interfaces must also account for the fact that robots are increasingly working in close proximity to humans. Accordingly, it is imperative that these systems be able to respond to users and adapt their behaviors in real time. Over the next five years, researchers need to work on recognizing basic human behaviors and adapting robots’ actions in response. Over the next 10 to 15 years, this should develop into live coding and adaptation to the needs of complex tasks.
Social interaction is particularly important to the field of social robotics; however, it is just as relevant for industrial robotics. Eventually, cobots will need to be able to interact with human operators in a dialog. This will involve more complex interaction than the current programming methods allow.
Action and Intention Recognition (AIR)
The key to efficient human interactions with autonomous systems
Action and intention recognition (AIR) research investigates how actions and intentions are recognized in human interaction with autonomous systems. Specifically, the focus is on how humans and autonomous systems interact when moving in shared physical spaces. The dynamic recognition of actions and intentions between humans and autonomous systems is critical for safety and for public acceptance.
AIR focuses on human interaction with 1) social/assistive robots in the home, 2) autonomous transport vehicles in industrial environments, and 3) AVs in public traffic. In all of these scenarios, it is imperative that autonomous systems operate unobtrusively and transparently with the humans interacting with them, and possess cognitive abilities that allow people to intuitively and effortlessly communicate their intentions and desired actions.
Developing robots that interact naturally and effectively with people therefore requires the creation of systems that can perceive and comprehend intentions in other agents. AIR, along with the other disciplines of social robotics, is critical for 1) the broad acceptance of autonomous machines operating in society at large; 2) the more efficient and safe operation of these machines in real time; and 3) the development of new applications and businesses which will generate meaningful financial results for stakeholders.
The rise of the digital twin
What is a digital twin?
A digital twin is a virtual version of a physical object or process: a living digital replica that alters its state to mirror its real-world counterpart. The digital and physical worlds are mirrored, and through the use of sensors, systems are monitored and data analyzed so that users can preemptively resolve issues before they appear in the physical world.
With the digital twin, we can replicate almost anything that exists in the physical world in the virtual world. The idea emerged early in the twenty-first century but failed to gain traction due to economic and technological constraints. With the Internet of Things (IoT), the situation has changed, and the time has arrived for an explosion of digital twin use cases. Many even believe it is now inevitable that everyone will have a personal digital twin.
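The monitor-and-preempt idea can be sketched in a few lines. This is an illustrative toy, not any vendor’s API: a virtual pump mirrors temperature readings streamed from its physical counterpart and flags a projected overheat before it occurs in the physical world. The class name, sensor, and threshold are assumptions for the example.

```python
class PumpTwin:
    """Toy digital twin of an industrial pump (illustrative only)."""

    def __init__(self, max_safe_temp_c=80.0):
        self.max_safe_temp_c = max_safe_temp_c
        self.history = []  # mirrored sensor readings from the physical pump

    def sync(self, temp_c):
        """Mirror one temperature reading (deg C) from the real pump."""
        self.history.append(temp_c)

    def predict_overheat(self, horizon=3):
        """Naively extrapolate the recent trend a few steps ahead and
        report whether the projected temperature exceeds the safe limit."""
        if len(self.history) < 2:
            return False
        trend = self.history[-1] - self.history[-2]
        projected = self.history[-1] + trend * horizon
        return projected > self.max_safe_temp_c
```

A rising temperature trend trips the prediction before the physical pump actually crosses the limit, which is the preemptive-resolution property described above; production twins replace the naive extrapolation with physics-based or learned models over many sensor streams.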
The case for and implications of the digital twin
Many AI experts hold the view that the experiences we record and engage in online can be utilized to generate a digital twin in the virtual world. Unlike the more conventional notion of a digital twin that exists within an environment populated by commercial inhabitants and products, the new view diverges in several respects:
- Our digital twin exists in a ‘cosmic’ digital world.
- Our uploading of personal data and daily activities is fueling this trip into the digital world.
- Although the uploaded data about us is now distributed across various systems such as social media, financial institutions, and healthcare facilities, these will eventually converge into a single system.
- The digital twin will have the ability to access all of its human counterpart’s information for use in the digital domain.
- Upon death, an individual’s digital twin will continue to exist in the digital domain, and all of their data will be integrated into an AI system.
Korea’s tech environment is perfect for the digital twin to flourish
For the digital twin business to take off in earnest, we need not only the right companies with the right technologies (such as 5G), but also the right infrastructure and a cultural mindset receptive to new technologies. The cultural, business, and technological climate of Korea is almost perfect for this. Globally, the key players in the digital twin market include GE (GE) and Microsoft (MSFT), among others. In Korea, Samsung Electronics is exploring how to develop and deploy digital twin technologies. As Samsung has the necessary components within its business ecosystem to capitalize on emerging digital twin opportunities, we believe it is well positioned to become a leader in this emerging market. Other companies such as LG Electronics, SK Telecom (NYSE:SKM), KT (NYSE:KT), and LG Uplus are also positioned to benefit greatly from emerging digital twin opportunities.
Autonomous unmanned vehicles and multi-agent systems
Safety and efficiency improve with the use of multi-agent systems
Today, autonomous robots are typically mission- and platform-oriented and frequently used for missions that are dirty or dangerous to humans, as well as for search and rescue, patrolling, and surveillance. Although aerial and ground robots differ, their obstacle detection and avoidance methods, path planning, and path-tracking algorithms are not dissimilar. Increasingly, such robots are being deployed as decentralized multi-agent teams rather than as single platforms. The advantages of this approach are that the system is more robust, adaptive, and fault tolerant, as there is no critical reliance on any specific individual, and that decentralization results in increased reliability, safety, and speed of response. Furthermore, distributed approaches are superior in that they do not require the complete preplanning of cooperative strategies: adaptive solutions can emerge while the systems are operating, through interactions among the vehicles and from the requirements of the objectives and environment, which may not exist at the preplanning stage.
The distributed-control multi-agent systems (MAS) approach is used in various autonomous systems, including unmanned aerial vehicles (UAVs), unmanned underwater vehicles (UUVs), unmanned water vehicles (UWVs), and unmanned ground vehicles (UGVs), as well as in search and rescue, collective robotics, and social cognition. With such systems, the autonomous agents can rapidly determine optimal solutions for their tasks. The more promising MAS models of social cognition are based on biological collaborative activities such as those of bees or ants.
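A minimal sketch of why decentralization is robust, under assumed values: each agent repeatedly averages its estimate (say, a rendezvous coordinate) with its immediate neighbors only. No central controller exists, yet the team converges on a common value; this is the classic average-consensus scheme, a simple stand-in for the richer bio-inspired MAS models mentioned above.

```python
def consensus_step(values, neighbors):
    """One round: each agent moves to the mean of its own value and its
    neighbors' values. Purely local communication, no central node."""
    return [
        sum([values[i]] + [values[j] for j in neighbors[i]])
        / (1 + len(neighbors[i]))
        for i in range(len(values))
    ]

def run_consensus(values, neighbors, rounds=50):
    """Iterate local averaging until the team (approximately) agrees."""
    for _ in range(rounds):
        values = consensus_step(values, neighbors)
    return values

# Four agents on a ring topology (each talks only to two neighbors).
RING = [[1, 3], [0, 2], [1, 3], [0, 2]]
```

On this regular ring the local averages preserve the global mean, so four agents starting at 0, 10, 20 and 30 all converge toward 15; losing any single agent would change the agreed value but not break the mechanism, which is the fault tolerance argued for above.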
Sensor fusion systems outperform in developing solutions for AVs of all types that can block attacks such as LiDAR spoofing. These systems are better able to defeat the majority of spoofing attacks and are more likely to accelerate the deployment of AVs in different scenarios, in particular self-driving vehicles. LG Electronics partnered with AEye to commercially manufacture camera-LiDAR systems (iDAR) at scale to enable the rapid deployment of AVs, in particular self-driving cars. We visited AEye and analyzed their technology in our report Toyota Motor: Innovate and Invest in the Future, published on February 14, 2018. We view this partnership as a sign of LG’s growing presence in the RIoT space, and, along with its aggressive research and its offering of a variety of social robots for the home and other spaces, it shows LG is positioning itself to be a leader in the field. We believe these alliances, product launches, and aggressive deployments of social robots are all steps in the right direction, and we hold a very optimistic view of LG.
Robots in disguise
These are the droids you're looking for
The social element of robotics is evolving in parallel with the development of robotics itself and with the increasing interactions between people and machines through the robotization of the world. Considering the social component being integrated into robots and the data accumulated through current social robotics systems, we are of the view that companies with a greater understanding of HRI and the know-how to leverage the components of a RIoT system will have opportunities orders of magnitude beyond their competitors, as they will gather and control much more data about the physical world than can be obtained simply with autonomous cars on streets, mobile phones, and other components of existing big data business models.
We are now seeing robotization taking place in areas such as architecture and even furniture. The robots in these domains will interact with us in more intimate ways, and more frequently, than today’s social robots or even voice assistant systems such as Amazon’s (AMZN) Alexa or Apple’s (AAPL) Siri. Indeed, for such robots to be optimized for their tasks, they must not only understand the primary user but also learn from the other users who interact with them, and this necessitates a greater understanding of social robotics.
Entering Q319, we see the following trends emerging in robotics, which will only accelerate going forward. Beyond the simple vacuum robot, other robots, including companion robots for the elderly and the young, are increasingly entering the home. Our view, however, is that the penetration of robotics into homes will be realized primarily via other avenues, such as roboticized architecture, robotic furniture, and other devices. NVIDIA (NVDA) and Ikea are working on a robotic kitchen assistant, which we may see on the market in the very near future.
Last-mile deliveries by robots will become commonplace. With the entries of heavyweights such as Amazon, FedEx (FDX), and others, outdoor robot deliveries will become accepted by the public as well as businesses. Indoors, the numbers are coming in, and businesses are realizing the tremendous savings achieved by implementing autonomous mobile robots (AMRs) in their facilities. Companies such as Softbank-backed Fetch, Teradyne’s MiR, and, in Korea, Yujin Robot are poised to benefit from this accelerating trend. Last year, Yujin Robot signed an MoU with Hanwha to collaborate on a project which we believe will involve the integration of Hanwha’s HCR cobot technologies with Yujin’s GoCart AMRs. The most successful company in the cobot market today is Universal Robots, which, along with the AMR maker MiR, was acquired by Teradyne (TER). There is a natural synergy between AMRs and cobots, and we believe that Hanwha and Yujin have identified the immense market potential of this synergy.
The deployment of all types of robots is accelerating and more humans will increasingly find themselves living and working with robots. To help humans collaborate more effectively with robots, forward-looking companies will benefit by developing and integrating robust social functions into their autonomous systems.
Disclosure: I/we have no positions in any stocks mentioned, and no plans to initiate any positions within the next 72 hours. I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.
Additional disclosure: Hyundai Motor Company is a passive shareholder in our bank.