
ROBOTLAB BLOG

Everything You Need To Know About Robotics in Businesses



[WEBINAR] Human Robot Interaction with Furhat Robotics


Learn about Furhat, a collaborative tool for HRI research and development. Hear from experts in higher education from Furhat Robotics and RobotLAB as well as our very special guest Chris Thorstenson, a professor from Rochester Institute of Technology, and his clever use of the robot.

 

Video transcript below.

Peter:

Okay. Hello, everybody. Thank you so much for taking the time to join us today; it's sincerely appreciated. I'm going to start off today's webinar by sharing my screen here. Alright. Everybody, today's webinar is RobotLAB's human-robot interaction webinar with Furhat Robotics, brought to you by the Rochester Institute of Technology, Furhat Robotics, and of course RobotLAB. During today's webinar we're going to be covering a lot: we're going to start off with an introduction to RobotLAB, talk about the partnership between Furhat and RobotLAB, and follow that up with a demo of Furhat and a Furhat customer use case in the real world. Following the Furhat demo we will also be fielding some questions, but the majority of the questions will be handled during the Q&A session that concludes the webinar.

Just a little bit about RobotLAB in a nutshell. We were founded in 2011, so we've been around for over a decade now doing educational technology integration. Our market is worldwide: we are headquartered in San Francisco, but our market is global, and we have customers in pretty much every single country, which is something very exciting. Our partnership with Furhat began in 2019. At RobotLAB we're platform agnostic, and to be a platform-agnostic technology integrator you really have to cover as many different verticals as possible. In 2019 we did not have a dedicated social robot as part of our catalogue, and that's where Furhat came in. We keep a carefully curated catalogue and try to make sure that everything in it is top of the line, so it was a very natural partnership between RobotLAB and Furhat, because Furhat is the best at what they do. They're a leader in the industry and they're only growing and getting better. They recently acquired Misty Robotics as well and are growing that brand, and there are lots of software updates that have already come and will come in the future. I know that Furhat is doing some other big things too, with a conference of their own coming up, which our friends at Furhat will talk about shortly. And with that, I will hand you off to Susanna to allow you to meet Furhat.

Susanna:

Fantastic, thank you so much Peter for that introduction. And hello from Stockholm, Sweden. Snowy Stockholm today, actually. Super good to meet all of you. I'm here with my colleague Fabian, who you will meet in just a few minutes, and of course also the Furhat robot. But before we introduce you to the robot, I wanted to give a little bit of background about our company, what we do, and our vision for the future. My name is Susanna Dylan Beck. I am Head of Marketing and Strategic Partnerships here at Furhat. Let's kick off; I will just share my screen, bear with me. There we go. Furhat was founded here in Stockholm, at the Royal Institute of Technology, where our four founders carried out research.

The company was founded in 2014, so we very recently celebrated our eighth birthday, but the research began much earlier than that. One of the key questions our founders wanted to answer and explore was: how can we build a machine that allows us to interact with it the same way we do with humans? While it may seem obvious now for all of us, especially on this call, to talk with our computers, our cell phones, and our smart speakers, this was 10-12 years ago, when that was quite rare. They started off the research by investigating what would make users most comfortable: speaking to only a speaker, to a virtual avatar, or to an actual physical robot, and they saw quite early that the physical representation of the robot was key to creating a comfortable user experience. So this is one of the very, very early prototypes of Furhat, with the actual fur hat on it, and there's a story to that, which I'll share with you. The robot today has evolved to look more like this, and this is also the version that you will see with Fabian in just a few minutes.

But we quite often get asked why we call our robot and our company Furhat. In the very early days, even before the company was founded, the four founders had been invited to the London Science Museum. They had promised to bring a prototype of the robot by a specific deadline, and they were running a bit late. When only one or two days were left, they had a working prototype very similar to the one you see in the picture here, but it had cords sticking out all over the back. We're in a Nordic country, it gets cold, especially in the winter, and they found a fur hat in the office that some of the other students had left. So they put the fur hat on the robot to cover all the cords and brought it to London. There was a lot of media and a lot of other companies and researchers there, so the name "fur hat robot" quite quickly stuck, and that was it. When the company was later founded, there was no opportunity to choose another name, because the names Furhat robot and Furhat Robotics were already established.

So if you're wondering why the robot has such a seemingly strange name, there's a very logical reason for it, and a rather nice one, since we're from a Nordic and fairly cold country. In essence, what we believe is that face-to-face interaction is the most natural, expressive, and empathetic way of communicating with each other. We wanted to bring that very human aspect of communication into a machine, or a robot, and to allow us, to the closest extent possible, to use the same kinds of communication mechanisms with it that we use with each other.

Fabian will show you much more of that, and how Furhat can express itself not only in speech but also with a very, very expressive face. At Furhat, we believe that the next big change in social interfaces is the social robot. We went from the computer, to the smartphone, to new kinds of technologies like VR and AR, and we believe the social robot is the next big leap. It doesn't make the others obsolete, but it is really the key next step. Our goal as a company is to progress that journey and to get robots out into the world; not just any robots, but robots that allow us to really communicate as the people we are. I think we are on a good track, but of course we're also working with a very long-term horizon; as many of you on this call know, we have come a long way and we still have a long way to go. But before I hand over to the demo, I wanted to give a little sneak peek into what different clients are using Furhat for. We're going to listen to Christopher later in this webinar, which is going to be super interesting, so this is just a short snapshot from both academia and the commercial angle.

So looking at academia first, how could Furhat be used there? Well, we have one interesting example from Uppsala University here in Sweden, where the team is investigating when and why the uncanny valley effect appears in a robot interaction. It's been very useful for them to have a changeable and customisable face: they have tweaked different aspects of the face and had users come in and do different kinds of interactions. Being able to customise the face, instead of having a fixed one, has been a key aspect in figuring out when the uncanny valley is present or not.

Another example is from Heriot-Watt University, where they are investigating whether a robot could help operators when they are making very difficult decisions in hazardous environments. One example: if you are working with remote teams in battle, could a robot help the operator with that decision making and communication, and in what way should we structure that interaction? Another example is from Northeastern University, where they are exploring how we can have a long-term relationship with a social robot: what needs to be present in the robot to form a returning, long-term relationship, as opposed to just a very quick one-off interaction.

So that's what that team is working on. On the commercial, or more corporate-innovation, side, we have one example, also from Sweden, with a company called Tengai, who are creating the world's first unbiased recruitment robot. The Furhat robot, coupled with Tengai's software and skill, actually performs screening interviews as the very first step of an interview process. This is live and working with clients. Of course, it's not the robot who makes the final recruitment decision, but this is a way to remove any bias of age, gender, or ethnicity from that very first and important screening interview.

So, a very interesting way of using a robot. Another example is from Japan, from Bandai Namco, the legendary gaming company. They wanted to explore how they could bring their animated manga characters to life in the physical world, while keeping the very expressive and distinct appearance their manga characters are known for. So we created a replica of a character on a Furhat robot, with a custom-made mask and face, to be used, for example, in theme parks: instead of humans walking around in fluffy bear outfits, that could also be done by a robot. Lastly, this is an example from Deutsche Bahn in Germany, who are looking into Furhat's multilingual capabilities and have created a travel concierge that can help travellers at train stations in many different languages. As an organisation they cannot, for cost or practical reasons, have staff who speak 30-40 languages, so they are exploring how they could use a robot for that instead.

 

So these are just some ideas and examples of what Furhat can do; we'll look at more detail when we hear from Christopher. But now I'm going to stop talking, and I'm going to hand the camera, and the word, over to Fabian and, even more importantly, to Furhat.

 

Fabian:

Thank you very much Susanna for this introduction to Furhat. As she already said, I'm going to take over with a practical demonstration here in a second, but first I quickly wanted to introduce myself. My name is Fabian Cap. I work at Furhat Robotics as a solutions engineer, and I'm also the lead of customer success. I will just quickly change my camera so that you can actually see the robot better, like this. You'll see me and the robot again as we walk over to our huge studio, which is actually tiny, but big enough to do this demo, I hope. So let's take it away; I will wake up this Furhat in a second.

Alright, Furhat, it's time to wake up.

Furhat:

Hi there, Fabian, good to see you again.

Fabian:

Good to see you.

Furhat:

And hello there to you too. How are you today?

Fabian:

I am fantastic furhat, how are you?

Furhat:

Yeah, I feel pretty good myself this wonderful Tuesday.

Fabian:

Nice.

Furhat:

Alright, enough with the chit chat. Let's start today's presentation.

Fabian:

All right.

Furhat:

I am a furhat robot and I consider myself the cutting edge of social robotics. I'm here in this world to make technology more conversational and more social. You could say more human. Some people find human likeness in robots to be uncanny. But I can reassure you most people find me rather delightful. Would you agree?

Fabian:

Yes. It is a pure delight to work next to you furhat, I love it.

Furhat:

I have a very life like and expressive face.

Fabian:

You can see it can move different parts of the face and make different facial expressions of all kinds.

Furhat:

Can you do this?

Fabian:

What? With the iris? Yes, I can.

Furhat:

Okay, but how about this?

Fabian:

No, that I can't do, and you know that.

Furhat:

Well, I didn't think so. My eyes can show where I am looking. Did you see how nicely the light is reflected in my eyes?

Fabian:

That's so beautiful when you stand next to it.

Furhat:

And my face has these automatic micro movements that create a feeling that I'm alive even when I am inactive.

Fabian:

That's the inactive mode, but it includes blinking and really slight movements of the whole skin.

Furhat:

My head moves in all directions. And personally, I love to be able to do like this.

It really makes me more expressive. Or what do you say?

Fabian:

Yes, if that makes you more expressive to be able to do that, then it's really cool

Furhat:

I can speak with over 200 different voices in more than 35 languages, and lip sync automatically to (inaudible 16:19)

The limits of my language are the limits of my world. Maybe a bit pretentious?

Fabian:

Yes, a bit pretentious indeed, even for you.

Furhat:

I can also make different sounds like so...

Fabian:

Nice.

Furhat:

It is also possible to use pre recorded voices like this one, and still have fun with them.

Fabian:

So that is my own voice recorded and put on to the robot.

Furhat:

I can have many different faces of different genders and ethnicities. And to make your own is not very difficult. Combine these capabilities, and I can portray lifelike characters. Should I do some characters?

Fabian:

Yes please, that's good.

Furhat:

All right, let's go. Oh, hello. I know this may seem a little bit strange but it's really (inaudible 17:32)

Fabian:

That's a real TV presenter from Singapore.

Furhat:

Greetings earthlings, take me to your leader.

Fabian:

Fantastic.

Furhat:

Yeah, I know, that was dope. That's because that's what I do. But there is more to me than just my face. In a real conversation, having eye contact is very important. I'm looking at you right now.

Fabian:

Yes.

Furhat:

Move your head around and I will follow.

Fabian:

Alright. I cannot move a lot here, but I will try as much as I can.

Furhat:

Tell me when you want to move on.

Fabian:

Yeah. Let's move on. I think we should move on now.

Furhat:

I think that's enough.

Fabian:

Yeah.

Furhat:

I can even see where a person's attention is and recognize if they are smiling. So, when you walk up to me, I can detect you, look you in the eye, say hello, and return your smile. Let's try that.

 Fabian:

All right.

Furhat:

Please look away, and let's pretend we just met.

Fabian:

All right.

Furhat:

Hi, dear stranger, can I have your attention?

Fabian:

From now on, the conversation will not continue unless I turn my face to the robot again.

Furhat:

Now that I have your attention, how about a smile, beautiful smile.

Fabian:

Thank you.

Furhat:

Now that I know you are happy, we can continue with the presentation.

Fabian:

Okay.

Furhat:

I can also handle many humans at the same time. Why don't you introduce someone else to me or you can place a virtual user if there is no one else around.

Fabian:

Yes, (inaudible 19:17) is around, but she's in a different room. So I will just go for the virtual user, and I will place them in this corner behind me, where no one can ever stand. Okay.

Furhat:

Hi dear, nice to meet you, second human.

Fabian:

(Inaudible 19:34)

Furhat:

So you can see that I can handle multiple people and distinguish them from each other. I can also hear and understand what they are saying, like this. What is your favourite colour?

Fabian:

My favourite colour must be blue.

Furhat:

Wonderful choice. Blue is my favourite, too.

Fabian:

Aw, so nice.

Furhat:

This was just a very simple example of a conversation. I come with a powerful, extendable, and easy-to-use dialogue framework and SDK, so that anyone can create interactions far more advanced than this. That was it for my part for today.

I hope you enjoy the rest of this webinar with robot lab.

Fabian:

Thank you furhat, such a pleasure to have you.

Furhat:

My pleasure.

Fabian:

So, with that being said, Furhat is back on mute, because otherwise we would just continue to talk for the rest of the evening. And I give the word back to Susanna and Peter.

Susanna:

Yeah, thank you. Thank you so much Fabian. As Peter mentioned in the beginning, if you have any specific questions on the demo and Furhat's capabilities, we can also take those now. While you're thinking about that and adding it into either the chat or the Q&A tool, I thought, Fabian, it would be nice to show a little bit of both the masks and the web interface. When you only see the demo like this you get one perception, but it's nice to show a bit of the background on what's going on. So I'll leave it to you, and we're here in the background.

Fabian:

Yes, sure. I'll show the masks first. You have already seen quite a few different faces on the Furhat, and that's only one of the dimensions in which Furhat is customisable. Another is the actual physical mask that you can put on the Furhat; we already saw that in Susanna's presentation, for Bandai Namco for example. We actually have four of these masks here: the pointy-nosed anime character that I will put on in a second, a child, and also a dog. I will just show you how this looks. You can add some hair to the robot as well; you can really completely change its appearance. I will make it look at you as well so that you can see it better, like this.

Now you can see the robot, and you can then select a nice voice that sounds like a child. For example, I have a nice German voice that is called Leal, I believe; let's try this.

Furhat:

Hello, (inaudible 22:30)

Fabian:

Then all of a sudden it gives a whole different impression than the adult male voice and adult male face that was speaking to us for most of the presentation. There is really almost no limit to how you can customise the appearance of the robot, and the emotion that the robot provokes when you are in the same room with it. And then, as I promised during the presentation, I will show you how I added the virtual user. I have to go back to my computer and share my screen. Alright, let me see. This is my screen; here we go. So this is the web interface that is connected to the Furhat robot. Can you see my screen?

Susanna:

Yep, all good.

Fabian:

Great, yeah, so this is the web interface. You can see here how close I'm actually sitting to the robot, and you can see me through the robot's camera. You can see my face and some points that the robot identifies on my face. These are used to identify my face, and also to see whether I'm smiling; we have this as part of the demo, where the robot detects whether I'm smiling or not depending on the points on my face, as you see here. Apart from that, if you want to add a virtual user, which can be really useful, especially for debugging, we can use this view here, where you can see the attention space of the robot: basically, what happens very close to the robot or far away. I can see myself here being really close to the robot, I can also go further away, and the robot also sees where I am looking.

And you can see that we have a Misty robot in the background here, which the robot also tries to identify as having a face, so the face identification is really working well. If I want to add a virtual user, I can just double-click and add one here, and the robot will address them as if they were users, depending on how you program it, of course. This can be a very useful and very powerful tool. Apart from that, I can show you quickly our skill library. This is our demo robot, so there are a lot of skills here that we use to demo the robot.

These are skills that I have manually uploaded to the robot myself. But the robot also comes with a so-called Furhat skill library: different kinds of skills that come directly with the robot. They are not yet physically on the robot when you turn it on for the first time; however, you can very smoothly download the latest version from our Git repository and then just have them on the robot. You can see that I did that with the first five of them, and that three of them have recently gotten an update, which I purposely did not install, because I wanted to show you this. Okay, I could go on talking like this forever, but I will give this back to Susanna for now; these were the main things that we wanted to show you.

Susanna:

Yes, super, thanks Fabian. I can see there are some great questions coming in. Maybe one would make sense to answer now, and some of the others we can save for later: that is the question about which programming languages you can use to work with the Furhat SDK.

Fabian:

Yes. So the Furhat SDK is mainly written in Kotlin, which runs on the Java virtual machine, and Kotlin is one of the fastest-growing programming languages at the moment, to my knowledge. That has to do with the fact that Kotlin is used for Android programming, and Kotlin is really easy to use, I believe. We also have a domain-specific language that sits on top of it, so you can use commands like "furhat, say", and then it says something, or "furhat, smile", and it smiles. It's very intuitive. I myself had not worked with Kotlin before I joined Furhat Robotics a little over a year ago, but I had a very easy time coming in because I had learned some other programming languages, so this is nothing to be afraid of. Then, of course, we have two more ways to program the robot in case you do not want to use our SDK.

You can also use Google's Blockly programming language, which we also provide for the Furhat. Susanna has a nice example here. This is more of a drag-and-drop, very visual representation of what the Furhat does. You can see here that the Furhat says hello and then attends to the user; attending to the user means that the Furhat follows the user with its face, like I showed in the demo. And then you go on from state to state. The whole dialogue framework, regardless of whether you're using our Kotlin language or this Blockly language, is a state-based dialogue system, where you go from one state to the other depending on what happens in the first state, and you can have different transitions for different states. It's quite intuitive to use and follow.
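The state-based dialogue system Fabian describes, where each state defines what the robot says and which state a given user response leads to, can be sketched in a few lines. This is illustrative plain Python, not the actual Furhat Kotlin SDK; all state names, intents, and utterances here are invented for the example.

```python
# Minimal sketch of a state-based dialogue system, in the spirit of the
# framework described above. Illustrative only; not the Furhat SDK.

class DialogueState:
    def __init__(self, name, on_entry, transitions):
        self.name = name
        self.on_entry = on_entry        # what the robot says on entering
        self.transitions = transitions  # maps a user intent -> next state name

STATES = {
    "greeting": DialogueState(
        "greeting",
        on_entry="Hello! What is your favourite colour?",
        transitions={"blue": "colour_talk", "bye": "goodbye"},
    ),
    "colour_talk": DialogueState(
        "colour_talk",
        on_entry="Wonderful choice. Blue is my favourite, too.",
        transitions={"done": "goodbye"},
    ),
    "goodbye": DialogueState("goodbye", on_entry="Bye for now!", transitions={}),
}

def run_turn(current, user_intent):
    """Advance the dialogue one turn; stay in place on unknown intents."""
    next_name = STATES[current].transitions.get(user_intent, current)
    return next_name, STATES[next_name].on_entry

# Walk through one short conversation.
state = "greeting"
state, utterance = run_turn(state, "blue")  # -> colour_talk
state, utterance = run_turn(state, "done")  # -> goodbye
```

The Furhat SDK expresses the same idea in Kotlin states with event handlers; this sketch only shows the state-to-state transition structure that both the Kotlin and Blockly approaches share.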

Finally, I said there were three possibilities: you can also run your own show on the Furhat robot. If you have a dialogue system, like Google Dialogflow, or really any dialogue system, and you want to run it on the Furhat, you can do that by using our remote API, which allows you to steer the robot and use it as an input/output device. It sounds a bit hard, but this is also possible, and you can use your own software on it. So yeah, there are a lot of possibilities.

Peter:

Awesome. Thank you so much, Fabian, and thank you Susanna, for all of that fantastic information. Right now, everybody, we're going to get into our real-world customer use case. Again, this part of the presentation and the webinar is brought to you by the Rochester Institute of Technology and Chris Thorstenson. Unfortunately, Chris could not be here today, but he is an assistant professor in the Munsell Color Science Laboratory program there at RIT. His research is on how people perceive and evaluate other people, social robots, and virtual representations of people. The aim of his research and his program is to better understand ways to facilitate not only human-to-human interaction but also human-robot interaction. So this is very interesting stuff, and here is Chris on his research.

Chris:

So, my name is Christopher Thorstenson. I'm an assistant professor in the Munsell Color Science Laboratory and Program of Color Science at the Rochester Institute of Technology, and I'm excited to talk to you today about some of my research on visual and social perception of colour appearance in artificial faces. First, I'd like to take the opportunity to introduce our department at RIT. We are a highly interdisciplinary department focused on providing education, research, and outreach related to various fields of colour science, including: colour vision, the process by which we perceive colour; colour appearance modelling, working on equations and algorithms that help predict the appearance of colour; colour measurement, including spectral and colorimetric measurements of materials and light sources; colour appearance in emerging technologies like virtual reality and augmented reality; and colour physics.

That includes understanding the interactions between light, matter, and optics, and colour cognition, which is how we think about colour: for example, how we form colour associations and how we use colour to communicate information in graphics and visual media. My own research is mostly interested in how people perceive and evaluate social agents. By social agents, I mean persons, and representations of persons, that communicate and interact with people in a social way. This obviously includes real people and people displayed through photo and video or over Zoom. But it also includes artificial social agents: social robots ranging from highly realistic, human-like robots to more abstract social robots like Alexa, which are still social robots because they're designed specifically to interact and communicate with people, even though they don't attempt to look like people. It also includes virtual social agents designed to communicate and interact with others, like avatars and emojis.

My research in this area focuses on visual features like colour appearance and colour heterogeneity, which is the distribution of colour across the face: a person's face isn't just one single flat matte colour, but rather an uneven distribution of colour that varies across an individual's face. I'm also interested in textural details like wrinkles, lines, and smoothness, as well as transparency, which is actually more relevant in virtual environments. And I focus on how these impact the evaluation of social agents, mostly looking at things like realism (how real they appear), preference (what visual features contribute to agents we prefer to represent ourselves or interact with), and social inferences like emotional state, attractiveness, healthiness, personality, and so on.

So, to extend some of these research interests into the domain of social robots, I'd like to introduce you to the newest member of our lab, a Furhat robot we recently unboxed and voted to name Alex. Just a brief background on why we chose that name. Going back a little in history, Albert Munsell, our department's namesake, was foundational in the field of colour science: he created a popular and still widely used system, the Munsell colour order system, that characterises and organises colour based on relevant dimensions of the human visual system. He created a foundation with the goal of furthering education, research, and outreach in fields related to colour science, which eventually turned into our lab, the Munsell Color Science Laboratory. And he had a son, Alex Munsell, who continued to be influential in furthering research and education interest in colour science.

Given the criterion that we wanted a gender-neutral name, we thought Alex was a very apt choice, so we're very excited to welcome Alex to our lab. Next, I want to talk about a few different research projects that I'm planning to pursue with our newest lab member. The first is about trade-offs between realism and preference in rendering the appearance of social robots. The next is an investigation into changing facial coloration and emotion perception. And I'm also interested in user-centred approaches to understanding artificial face perception.

So first, I'm really interested in trade-offs between realism and preference when rendering artificial social agents. For example, you can imagine that some visual features, like high textural detail and moderate colour heterogeneity, contribute to more highly realistic representations of people, which makes sense because these things are present in real human faces. These are features that a lot of workers in areas like graphics, video game design, and CGI have really tried to include and improve to make their products as realistic as possible.

However, when people are designing avatars or editing photos and videos of themselves, they tend to prefer to smooth out some of these features or remove them completely, and they also tend to prefer images of others that do the same. You can see some examples here, where these features are very subtly different. The first image is one of the default models, which is more realistic, with high textural skin detail (wrinkles and lines) and some amount of colour heterogeneity, where the whole face isn't the same exact colour but has slight variations across it. The next two pictures show that same identity, but one with smoothed (though not eliminated) textural detail, and the other with a more uniform face colour distribution; again, these are the things people tend to prefer, at least when generating or designing representations of themselves.

It's important to note that even though these differences are very subtle, people are actually very good at noticing small differences and variations in faces, and these kinds of things can meaningfully change and influence how we evaluate and interact with others. So I think it's important to better understand how these trade-offs between realism and preference should be made, depending on the use case.

I'm also very interested in facial coloration, which is when the face changes colour slightly over time when experiencing emotion. The best examples are blushing, where the face reddens with something like embarrassment or anger, and pallor, which is when blood drains from the face and it becomes paler with fear or sadness. I've also done a good deal of research on how people associate face colours with many different emotions, and it turns out that people are pretty good at making emotion inferences from a wide range of face colour changes. Here I can show you an example of what the robot looks like when blushing. I'm playing this video now, and looping the blushing animation just to make sure that you can see it. So this is what it looks like when I incorporated blushing into the robot. I'm interested in a few different topics around this kind of artificial blushing. For example, I'm interested in visual detection.

So how much does a robot have to blush for us to visually notice that it's happening at all? And emotion detection: how much and in what ways does the robot have to change colour to make it look like it is expressing different emotions?

Augmenting emotion communication: in other words, how can we use colour to enhance the emotion expressivity of robots? And artificial extrapolation, which refers to the idea that this phenomenon, blushing, is a natural thing that occurs in humans, but it's also limited by our physiology, by mechanisms involving blood flow and the physical characteristics of our skin.

However, robots don't have these limitations, so it's interesting to explore whether we should just be copying this phenomenon onto robots as-is, or whether there are other ways we can use colour on the face to help them communicate emotion in ways that real humans can't, for example with different patterns, or wider, more obvious changes in colour, or things like flickering displays, or other ways to artificially communicate that human faces can't.
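The core of an artificial blush like the one in the video can be sketched as simple colour interpolation. The following is a hypothetical Python sketch of the idea only, not the Furhat SDK's actual animation API, and the RGB values are illustrative:

```python
# Hypothetical sketch of an artificial blush: interpolate the face colour
# from a baseline skin tone toward a redder blush tone and back again,
# producing the frames of a looping animation.

def lerp(a, b, t):
    """Linearly interpolate between two RGB colours for 0 <= t <= 1."""
    return tuple(round(a[i] + (b[i] - a[i]) * t) for i in range(3))

def blush_frames(base, blush, steps=10):
    """Return one loop of colours: base -> blush -> base."""
    up = [lerp(base, blush, i / steps) for i in range(steps + 1)]
    return up + up[-2::-1]  # mirror back down without repeating the peak

# One full blush cycle from a neutral skin tone to a reddish tone.
frames = blush_frames(base=(224, 172, 105), blush=(240, 140, 120))
```

Because a robot isn't bound by physiology, the same loop could just as easily drive patterns or flicker effects that no human face can produce.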

And finally, I'm really interested in user-centred approaches, whereby we study perceptions of things like realism and preference with individually customizable stimuli instead of using the same stimuli for all people. This is one example of how I think a pipeline like this could work. First, you'd have someone come in and sit in front of the robot. One really useful thing about this robot is that it has a camera built into its apparatus, and a suite of face detection and tracking algorithms as well. That means we'll be able to capture video and detect and track the user in real time. Then I want to calibrate the camera data using standardised colour calibration methods, and with that, we'll be able to extract a more accurate representation of the user's face colour. From there, we can essentially pipe that representative colour back onto the actual robot face to create an accurate match. So this is essentially generating an individualised robot avatar, at least for this one feature, and then the user could finally interact with it. For example, they could change the colour of the face with a mouse or keyboard if they don't think it matches very well, or if they want to make adjustments to make it more preferable. I think this kind of research approach is important, and it's the reason I'm really interested in it, because it's likely that people think about and evaluate face appearance very differently when it's their own face rather than the faces of other people, which is what the majority of this kind of research has been done on.
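The calibrate-and-extract step of the pipeline described above might look something like this minimal sketch, assuming a simple single-patch calibration against a reference colour; the function names and the calibration scheme are illustrative assumptions, not the actual method:

```python
# Hypothetical sketch: calibrate raw camera RGB against a reference patch
# of known colour, then average the calibrated face pixels into a single
# representative skin tone to send back onto the robot's face.

def calibrate(pixel, measured_ref, true_ref):
    """Scale each channel so the measured reference patch maps to its true value."""
    return tuple(
        min(255, round(pixel[i] * true_ref[i] / measured_ref[i]))
        for i in range(3)
    )

def representative_colour(face_pixels, measured_ref, true_ref):
    """Average the calibrated face pixels into one RGB skin tone."""
    calibrated = [calibrate(p, measured_ref, true_ref) for p in face_pixels]
    n = len(calibrated)
    return tuple(round(sum(p[i] for p in calibrated) / n) for i in range(3))
```

In practice, standardised colour calibration uses multi-patch charts rather than a single reference, but the shape of the pipeline is the same: calibrate, extract, then pipe the colour back to the robot.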

And finally, I'd like to conclude by sharing why I think the furhat robot is ideally suited for my specific research. I'm really impressed by its realistic capability and dynamic display. There are a lot of really subtle things, like micro movements, eye blinks, and head movements, that I think really make it feel like you're interacting with a person instead of just an image or a video. And actually, it's really funny, because when our IT staff came in to help us get it connected and set up, it kept detecting our faces, and kept moving its head to make eye contact and follow us around the room, which was honestly very surprising. The customization of the appearance is very flexible. As you can imagine, my research involves making a lot of very subtle appearance changes, and I've been able to do that pretty easily in this platform. The integration of features and workflow has also been really useful, for example being able to easily access the camera data and being able to drive the display with other software like MATLAB, which is necessary for putting together my experimental setup.

And finally, I have to give a shout out to the support I've received. Whenever I've had a question, there have been multiple people who instantly respond and meet with me. And whenever I'm trying to do something that's maybe not very common, or something they haven't come across or thought about before, they are really helpful in thinking through things like workarounds and future implementations. So a really big help there.

Well, thank you very much. I really enjoyed talking to you about some of my upcoming research in this area. If anyone has any questions or just wants to talk about this kind of research, please feel free to email me, I'd be happy to chat. Thank you.

Peter:

Alright, everybody. Thank you so much, Chris, especially for that insight into your research. That was very informative and fascinating, so thank you. As he mentioned, if you have any questions about his research, he's been very generous with his time so far, and his contact information was there if you have more questions.

That concludes the presentation part of today's webinar, so we will go ahead and start the q&a session now. My name is Peter Bowman, and I'm an account manager here at robot lab, so I will be working with all of you going forward on any potential projects that you may have with furhat. I'm able to answer questions not only during today's q&a session, but if you have questions that we may not be able to get to today, or that you just don't think of during today's session, feel free to reach out to me by email at Peter@robotlab.com. I'd be more than happy to talk to you there. I see that there are lots of questions going on in the chat.

And so let's see what we have here. I see our old friend Anshu asked about using furhat with autistic children, and Anshu, the answer is absolutely: furhat can be used in research dealing with autistic children, so that's absolutely a possibility. I see lots of people praising the demo. Let's see what we have here. Jeanette asks: the robot is very interesting, can the robot determine if a student is using the correct sounds when working on specific words, in correcting speech? Susanna and Fabian, do you think you might be able to field that question?

Fabian:

Yes. The question is about whether furhat is capable of correcting the speech of the participants, right?

Peter:

Correct.

Fabian:

Yes, so furhat itself, unfortunately, is not able to do this. This has to do with the speech recognition software on the robot. We are using the best speech recognition software that is out there at the moment, which is Google ASR. You can also use Microsoft ASR with the robot, and the licenses come with the robot when you buy it. But if you want to correct people, then you would have to build around it. So what you would have to do is probably give some clues to the speech recognizer for the typical errors that you are expecting, and then you can build around it. Or you would have to use your own setup. But the robot itself, the SDK that comes with the robot, is not able to do these kinds of things automatically. The robot doesn't know how things are supposed to be pronounced; that's not part of the platform. But you can reach out to me directly, and we can have a more detailed discussion about this if you want.
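The "build around it" approach Fabian describes could, for example, look like the following minimal sketch: compare what the recogniser heard against the word the learner was asked to say, and flag likely mispronunciations. This uses Python's difflib string similarity as an illustrative stand-in; it is not part of the Furhat SDK:

```python
# Hypothetical sketch: treat the ASR output as an approximation of what
# was said, and flag it as a likely mispronunciation when it is not
# similar enough to the target word.
import difflib

def check_pronunciation(recognised, target, threshold=0.8):
    """Return (ok, similarity): ok is True if the recognised word is close enough."""
    ratio = difflib.SequenceMatcher(None, recognised.lower(), target.lower()).ratio()
    return ratio >= threshold, round(ratio, 2)

# "tree" heard when "three" was expected is a near miss, so it passes;
# an unrelated word like "cat" would not.
ok, score = check_pronunciation("tree", "three")  # -> (True, 0.89)
```

A real setup would also feed the expected words and typical errors to the recogniser as hints, so the ASR itself is biased toward the vocabulary of the exercise.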

Peter:

Thank you so much, Fabian, for that answer, I appreciate the assist there. I see that Shoddy has asked: to improve speech with autistic children, can the company help to programme it with a specified curriculum? Currently, we do not have a specified curriculum for working with autistic children and the furhat robot, but that's definitely something that could be a possibility in our pipeline in the future, so thank you for that question. We also offer presentations over zoom whenever you would like; you can reach out to me again via email at Peter@robotlab.com and we can go ahead and set up a time for a demo. Looking at the q&a section, it says: is furhat available for mass production, or is it just a prototype? And the answer to that question is that furhat is absolutely available for mass production, not just a prototype, so we can deliver whatever quantity you need. Jennifer DeForest asks, has the robot been used in health care? And for that question, Fabian and Susanna, I'll go ahead and pass that back to you.

Fabian:

Yes, one of the, I would say, key use cases that we have in healthcare is a collaboration with Merck, the global pharmaceutical company, where we are exploring whether a robot can be used as a medical screener. Specifically, in this case, we are screening users for common but undiagnosed conditions, such as diabetes, hypothyroidism, and alcoholism. The screening itself is based on existing and verified interviewing questions, but we put it in a robot format, also to be able to put it in different public locations. And one thing we are investigating is whether we can reduce the stigma of being asked questions about your weight and your lifestyle by speaking to a robot instead of a human. I can actually post the link to that use case here while we continue the q&a, and you can look into it a little bit more. So bear with me here a second. Do you want to take the next question?

Peter:

Yes, absolutely, thank you for that prompt. So I see Steph asked: would you like to comment on the similarities and differences between furhat and the SoftBank robots? That's in reference to the robots NAO and Pepper. And the answer to that question is that they're very different, right? As far as robot lab is concerned, as I mentioned towards the beginning of today's presentation, one of the best parts about working with furhat and robot lab is that robot lab is platform agnostic. That means we really listen to your use case, we listen to your needs, and then we deliver a solution based on what the best fit is, right? So for example, the NAO robot may be more advantageous for folks in a K-12 setting: it has a wide range of use cases, it's able to teach math, it's a bipedal robot, so you can programme it to do cute little dances and things like that. But that's just not the purpose of furhat, right? Furhat is really a development tool and a learning tool that's mostly used in higher education. I don't know the extent to which it's been used in the K-12 setting across furhat's history, but in robot lab's experience it is normally a higher education offering, so that's probably the biggest difference. And as for Pepper, again, it's just a larger robot that has a much different use case than furhat. So when I'm working with my customers, again, I listen to everybody's needs, and we have a multitude of different platforms, and we will always offer the solution that best fits those needs.

Fabian:

Yeah, while you were answering, Peter, Susanna was posting some use cases that we have in the education setting. Many of our customers are researchers who use furhat for their own research, but then, as a side hustle, they also use it to teach programming, because most researchers also have to fulfil some teaching duties at their universities; it's just a common setup. And they use furhat to teach programming because we also have a virtual version of furhat that you can very easily distribute to your students, so that they can use this virtual setting at home to learn programming and complete the tasks that they get from the teacher. So yeah, that's just something I wanted to add to complement what you said. And as I said, Susanna posted some links to use cases that we have in this setting as well.

Peter:

Very interesting. And thank you so much for that expansion, Fabian, that's sincerely appreciated. I see Jeanette has a question: can the robot translate from one language to another? Can it hear Spanish and translate to English? Can it ask a question and respond to a correct or incorrect answer? And the answer is yes, with development, right? So if you create a chat bot, that's what gives furhat its intelligence, right? You have to hook it up to something. By default, out of the box, if you ask furhat who won the Super Bowl last year, that's not something furhat is going to be able to answer. However, if you put in the development effort, then it's a possibility. So that's what I would say for that. Moving on, I see Rudy asks: is the virtual platform free?

Fabian:

Yep.

Peter:

Yep. So this is a part of the SDK. And Miriam...Go ahead. Sorry.

Fabian:

I just want to add: if you want this for free, just for yourself, just sign up on our webpage for free SDK access; we grant this to you, no matter what. If you want larger quantities, as I was saying, for teaching, like if you need 100 licences in one go, we really appreciate it if you reach out to us. You will get them either way, but reach out to us and you will get them in one batch, so that's simpler for you and simpler for us.

Peter:

Fantastic. Thank you for that explanation as well. I see: is it programmable in Python? I believe the answer is yes. Is that correct, Fabian?

Fabian:

Yes.

Peter:

Next, we have: is there low-level access to the robot system? For example, is it running on a Linux board? Can I log in and directly control the robot at a low level?

Fabian:

It depends what exactly you mean, but you can access the robot via the remote API, so you can steer it with whatever setup you have.

Peter:

Absolutely. Maybe we can have a follow-up email conversation after today's webinar on that question. Eric asks, do you have any research on improvements in the teaching and learning process when using robots? Yes, there is that kind of research out there, absolutely, and following today's webinar, Eric, I can reach out to you with some content on that question, so thank you. Duvall asks, is there a curriculum available for a high school robotics class? Duvall, no, there's not a pre-baked curriculum available at this time, but again, that may be something in the pipeline in the future. Based on my experience in both higher education and the K-12 market, what we offer here at robot lab are learning tools to execute the curriculum that you already have in place, right? So if you have a programming course, or a coding course in different languages, then furhat is a tool for you to teach that and get creative with, but currently we do not have a pre-baked curriculum. I am going to look at our q&a again here.

What is the price, and what are the commercial conditions? Okay, so as far as the price for furhat goes, that's a fantastic question, thank you so much for asking. My favourite part about furhat is that, again, they're always growing and always improving, right? So with that, they actually recently created three different models of the furhat offering: a basic, a standard, and an enhanced package, each with different prices. If you would like, I can send you a price quote and go over those differences after today's webinar; I just want to make sure I can get to as many questions as possible. I don't see any more questions in the q&a. Oh, we do have one more: is furhat available for purchase in different countries? Yes, furhat is available globally.

Fabian:

One of our customers who asked me something in the chat is from the Bermuda islands, so we really reach very remote locations. I will come back to you directly, at the latest tomorrow; it's the end of the day here, but at the latest tomorrow I can answer your question personally. And also, Susanna wanted to say something.

Susanna:

Yeah, if I can just hijack the end here for one minute: you mentioned in the beginning, Peter, that we also have a conference coming up, which is in a few weeks. Typically, at least twice per year, we gather everyone in our global community, where we invite our clients, similar to Christopher here, to speak about their research and the work that they are doing, and we also show the latest updates to furhat. As Peter mentioned, we actually recently acquired an American company, Misty robotics, that some of you may have heard of. Misty is used even more both for children with special needs and within education. So in this conference coming up, and I'll post the link to it very soon here, we will hear from both furhat and misty users. That could also be interesting, and you can join; it's free. It's two and a half hours, at a time suitable for American time zones.

Peter:

Fantastic. Thank you so much for that conclusion, Susanna. Susanna, Fabian, it's always a pleasure speaking with you; thank you so much for helping us out during today's webinar. If any of you have any questions about furhat following today's webinar, or if you would like a price quote, please reach out to me directly at Peter@robotlab.com and I'd be more than happy to help you out. I know that there are some last-second questions coming in; unfortunately, we do need to start wrapping up, but again, I'd be happy to talk to you post-webinar via email. Furhat robotics, thank you so much, I look forward to continuing to work with you in the future. And I hope everybody has a wonderful rest of their day. Thank you.

Fabian:

Thank you so much Peter, thanks everyone.

Susanna:

Thanks, bye

  • Sep 12, 2022 8:30:28 AM
