Podcast: Play in new window | Download
Subscribe: Apple Podcasts | Spotify | Pandora | Email | TuneIn | Deezer | RSS | More
Let’s talk about digital identity with Monique Morrow, President and Co-Founder of the Humanized Internet and President at the VETRI Foundation.
We’re very excited to kick off #LTADI 2020 with Monique Morrow, multi-hyphen technology innovator and one of Forbes Magazine’s top 50 women in tech globally.
[Scroll down for transcript]
“2020 is going to be the year for digital identity and, even more so, self-sovereign identity”
In episode 13, Oscar and Monique discuss her route to digital identity, ethics in technology and credentialing, self-sovereign identity (SSI), and the various interesting projects that she is involved with.
Monique Morrow is President and Co-Founder of the Humanized Internet, a non-profit organisation focused on addressing the need to control our identities as well as providing digital identity for those individuals most underserved. The belief in the social good of technology with embedded ethics has guided Monique’s extensive work with blockchain, especially its applicability to education and credentialing as well as other industries including healthcare, insurance, and Internet of things.
Find out more about the Humanized Internet at www.thehumanizedinternet.org.
Monique is also President of the VETRI Foundation in Switzerland. The main purpose of the Foundation is to manage the platform presently known as VETRI, and the funding, establishment and execution of initiatives focused on the management and control of data and privacy. The Foundation abides by the key tenets of “Trust and Transparency”. The vision is to enable individuals to exercise self-determination over their data. This translates to assessing possible investments and activities towards secure self-sovereignty and secure e-vault mechanisms for the management and storage of data.
Find out more about the VETRI Foundation at vetri.global/the-vetri-foundation-is-here.
Much of Monique’s work operates at the intersection of blockchain technology, security and privacy issues, questions of legal jurisdiction, and portfolio development. She has had the opportunity to engage with and explore these issues in her capacity as a member of the procivis.ch and VETRI Global advisory boards based in Switzerland. Furthermore, she is also an active member of the IEEE Ethics in Action Executive Committee as well as Co-Chair of the IEEE Ethics in Action Extended Reality Committee.
More about Monique can be found on LinkedIn and at www.moniquemorrow.com.
We’ll be continuing this conversation on Twitter using #LTADI – join us @ubisecure!
Go to our YouTube to watch the video transcript for this episode.
[Podcast transcript]
Oscar Santolalla: Let’s Talk About Digital Identity, the podcast connecting identity and business. I am your host Oscar Santolalla.
Hello and thanks for joining today. We are starting the New Year 2020 and I hope you had a nice time in 2019. Now we are back and we have a fabulous guest to start this year – a guest who has an amazing career in technology and in cybersecurity, and who today has embarked on many projects that combine technology with social impact, a lot of which is also related to digital identity.
Monique Morrow is President and Co-Founder of the Humanized Internet, a non-profit organisation focused on addressing the need to control our identities as well as providing digital identity for those individuals most underserved. The belief in the social good of technology with embedded ethics has guided Monique’s extensive work with blockchain, especially its applicability to education and credentialing as well as other industries including healthcare, insurance, and Internet of Things.
Monique is also President of the VETRI Foundation in Switzerland. Among other accolades, Monique has been recognised in the industry for her tireless focus on social good. Monique was selected as one of the Top Digital Shapers 2018 in Switzerland. In this year, One World Identity recognised Monique as one of the top 100 influencers in identity for 2019.
Hi Monique.
Monique Morrow: Hello Oscar. It’s a pleasure to be here, to talk about an important topic.
Oscar: Thanks a lot. The pleasure is truly mine and I really want to hear about the latest projects you have. I’ve said a bit about you, but that’s not enough, so I would like to hear more. So tell me a bit about your journey – how your career led you to this world of digital identity.
Monique: Sure. I mean look, first of all, I should say Happy New Year 2020 to everyone, because I do believe 2020 is going to be the year for digital identity and, even more so, self-sovereign identity.
So the journey is the following: how I got very much involved in this space goes back to how I got involved in blockchain and credentialing.
What we were finding is that identity is a very important component especially if you’re talking about people who don’t have one maybe because of a crisis, maybe because of war, whatever the situation is, and they don’t have their papers anymore.
Identity is a very multifaceted theme because it also pertains to culture but it also pertains to how you are perceived and what rights you have when you come into a country or into an organisation or wherever.
So the issue around digital identity is really around what happens, what is digital, what is acceptable? What are the exact components of that? I think what really fostered that journey – the actual components of which we will discuss during the course of this interview – was in fact when I first established and co-founded the Humanized Internet, which is a non-profit based in Switzerland. And I have a co-founder based in Toronto, Canada and a co-founder who is really the heart of the story, who is a former refugee – and ‘refugee’ is just a legal status.
He’s a person, an individual who happened to come from the Middle East, particularly Syria, and was studying in Greece and literally lost his passport. Literally lost his passport, and although he could be seen as a student, he wasn’t in the system.
It didn’t matter that he had documents on Google Drive. They simply were not going to be accepted, and there was no way for him to go back to a war situation, to go back to a country that is at war, in this particular case Syria. His life would have been in peril.
So it really fostered the story about how we can actually have a world where we can hold and be in control of our digital identity such that it can be accepted by other organisations. And that’s what got me very much involved in identity overall – digital identity, self-sovereign identity – and I have colleagues who have been working in the space for many, many years. So it has been a journey. But now we’re looking at, you know, how we deal with credentialing.
So it has become something where, you know, I don’t talk about it from a theoretical perspective. I talk about it in really concrete examples – concrete examples from working with people who are at the centre of the stories and narratives that we read about every day.
Oscar: Yeah, this is a very concrete example and it’s hard to believe that someone can just lose their passport and no longer have an identity. It’s really hard to believe.
Monique: You know, it’s incredible because it’s just like a scene out of a movie. We are actually publishing a book called The Humanized Internet, and the tenets of The Humanized Internet are not only about digital identity and control, but also about the notion of ethics and technology, and the whole issue of how we actually put some governance model around it.
But it was like a movie for him, and he couldn’t go to a consulate or an embassy because there is none – it’s a country at war. So he was in an extremely stressful situation. Now he’s living in Germany, in Berlin, and I think he’s contributing to society in that sense. He’s a software engineer in development.
But the journey he represents is the story of many. We have to think about questions. What happens if there’s a terrible earthquake? What happens if there’s a fire? What happens if your organisation no longer exists? How do you actually credential, but also how do you actually get a component of digital identity such that it carries weight and that you are in control?
So digital identity as a topic has several components here. One is where you have some level of control. I will put that in quotes. The other is what is being profiled about you. This gets into the story of companies actually making money and profits out of your identity or your presence on the internet where you’re not involved in that exchange.
The only exchange, the only involvement that you have, is that you’re given “free services” which really aren’t free in the end. So your presence over the net is being profiled and used – it could be used for marketing purposes and it could be used for nefarious purposes.
So we have several aspects of this notion of digital identity that move towards self-sovereign identity – towards how we go into the 21st century and beyond and look at what control looks like. What does a digital apostille mean, and how is it going to be accepted by many countries and organisations?
So if I take a step farther, Oscar – because I’m in the blockchain world, and we’ll talk about that a little more – credentialing is an opportunity here. How do you have universities, maybe in Helsinki or so on, that actually use this technology to provide you with a hash or a key which is referenceable on the public blockchain by the validating organisation, in this particular case the university, forever? Because in case something happens, at least you can reference it.
If you’re a refugee, or a person who happens to be stuck in a humanitarian crisis, what you want to be able to do – even though you can use certain technologies like artificial intelligence to have some predictability – is, say, go and study computer science and then have a certification, a credential, given by the validating organisation that is referenceable on the blockchain.
But having said that, if you are resettling, which is the legal language, and you’re moving for example from Jordan or from whatever country to a receiving country – let’s say Sweden, for example – you want the receiving organisation to accept your credentials so that you can work.
The problem is when it’s no longer acceptable to the receiving organisation. People spend quite a bit of money – 50,000, 60,000, 90,000 euros or more – to recertify and re-credential. That weighs on the social system, because what happens is they end up trying to get help from the social system, and it plays, unfortunately, into a negative political narrative.
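To make the credentialing idea above concrete, here is a minimal sketch of anchoring only a hash of a credential so it stays verifiable even if the issuing organisation disappears. It is an illustration only, not how any particular university or the Humanized Internet implements it: the names are invented, and a simple in-memory dictionary stands in for the public blockchain.

```python
# Minimal sketch: anchoring a credential hash so it stays verifiable even if
# the issuing organisation disappears. The "ledger" here is just a dict
# standing in for a public blockchain; all names are illustrative.
import hashlib
import json

def credential_hash(credential: dict) -> str:
    """Hash a credential deterministically (sorted keys, stable separators)."""
    canonical = json.dumps(credential, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# The university (the validating organisation) issues a credential...
diploma = {
    "holder": "Student X",
    "degree": "BSc Computer Science",
    "issuer": "Example University",
    "issued": "2020-06-01",
}

# ...and anchors only the hash on a public ledger, keyed by an identifier.
public_ledger = {}  # stand-in for a blockchain transaction / registry entry
public_ledger["example-university/2020/0001"] = credential_hash(diploma)

# Years later, a receiving organisation can check the document the holder
# presents against the anchored hash, without contacting the issuer.
def verify(presented: dict, anchor_id: str) -> bool:
    return credential_hash(presented) == public_ledger.get(anchor_id)

print(verify(diploma, "example-university/2020/0001"))                      # True
print(verify({**diploma, "degree": "PhD"}, "example-university/2020/0001"))  # False
```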
Oscar: Yes. How big is this problem – how many people worldwide are affected, or potentially affected, by this lack of identity?
Monique: I mean, you know, the numbers are going up, and I think if we look at the UNHCR report, it has gone from 65 million to around 73 million displaced people. We can make a correlation between certain things that are happening and this displacement. The climate crisis is one example. There’s a migration component to it, and there’s also displacement overall because of these crises, and people are seeking humanitarian aid.
The thing of it is that people have a human right to work. In most cases, they really want to work. They want a “better” life. So you’re dealing with a very complex situation. But I think an example that’s often cited is the Rohingya tragedy and crisis.
So people are stuck in these horrific camps and they want to be recognised as a people. They don’t want to be called a refugee. By the way, people don’t like that term. They are people who want to contribute to some extent to society.
What happens in a case like this – and this is why technology can be an enabler for digital identity, and for looking at the provenance of where you’re coming from, etc. – is that because there’s no recognition, or maybe only some level of recognition from UNHCR, you’re in somebody’s database and you’re subjected to human trafficking. And human trafficking is a huge humanitarian crisis at all levels – it’s happening in our own backyard – because whatever kind of identity people have, and even more so when they don’t have any, they’re even more vulnerable.
So it is incumbent upon us as a society and as a group of technologists to actually work together with governments and organisations to see how we can take the issue to a level where we can have recognition digitally, right?
And it will be forever with us. So, as an example – I think 2020 is going to be the year of self-sovereign identity, SSI. There you are in the middle. It’s portable. Your credentials are portable. You have the right to selectively disclose the parts of yourself that you want to disclose, and to actually look at where that data is going. It gets into privacy. It gets into how the data is being used and so on.
I think that we’re going to see more and more selective disclosure, and the notion of self-sovereign identity becoming more mainstream at the end of the day, especially from 2020 moving forward.
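The selective disclosure Monique describes can likewise be sketched in a few lines. This is a simplified illustration of the general pattern (similar in spirit to salted-hash disclosure schemes such as SD-JWT), not any specific SSI product: the issuer commits to salted hashes of each claim, and the holder later reveals only the claims they choose.

```python
# Minimal sketch of selective disclosure: the issuer commits to salted hashes
# of each claim; the holder later reveals only the claims (and salts) they
# choose, and the verifier checks them against the commitment. Signatures and
# real credential formats are deliberately omitted.
import hashlib
import os

def claim_digest(name: str, value: str, salt: bytes) -> str:
    return hashlib.sha256(salt + f"{name}={value}".encode()).hexdigest()

# Issuance: one salt per claim; the issuer publishes only the digests.
claims = {"name": "Student X", "birth_date": "1995-05-05", "degree": "BSc CS"}
salts = {k: os.urandom(16) for k in claims}
committed = {claim_digest(k, v, salts[k]) for k, v in claims.items()}

# Presentation: the holder discloses only the degree, nothing else.
disclosed = {"degree": (claims["degree"], salts["degree"])}

# Verification: recompute the digest of each disclosed claim and check that it
# appears in the issuer's commitment. Undisclosed claims stay hidden.
ok = all(claim_digest(k, v, salt) in committed for k, (v, salt) in disclosed.items())
print(ok)  # True
```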
Oscar: Yes, definitely. It’s a really big problem, as you described very well. So what about The Humanized Internet as a project? You have of course already said something about it, but tell us – what is its status?
Monique: Yeah. So we started out as three people, and as an association – it’s an association as recognised in Switzerland. It’s non-profit, and fundamentally one of the big projects has been amplifying some of the issues we’ve been writing about on Medium, and in our forthcoming publication.
Having said that, the other issue we’re looking at – for example, in wanting to work with some universities here in Switzerland – is how we solve the issue of storage of our credentials. Not in a central database – it’s something that should be distributed. And how do we have this notion of digital keys, or what we call lock boxes?
That is still a very hard computer science problem to solve, but it’s something that we’re very interested in solving, because everything can be destroyed. If I have a trace in a bank or a vault, but that gets destroyed, at least I want some notion of where I hold certain data or certain credentials – something about not only my identity but about values or things that I store in a digital format – such that it’s distributed, and such that I have something of a notion of digital keys that I can share with members of my family. So if an incident occurs, there would be, let’s say, an event flag that’s sent out, and there would be a lock, right?
And I think that we need to solve for something at that level. It’s something that we’ve been thinking about for some time as a project. We have been looking at the notion of whether we could have some sort of identifier, if you will – a decentralised identifier – as the Humanized Internet, but more importantly within the context of self-sovereign identity as, let’s say, a standard. That would be fantastic.
We realised that there are a lot of organisations – quite a few organisations, I should say – involved in some level of identity or digital identity. It’s a crowded space, but it also means that there’s an opportunity here.
So for us, we’re looking at the next level of problems that need to be solved. So in summary, it’s really about what does a digital box look like and how do I distribute digital keys to close members of family and friends?
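One well-known technique that fits the idea of distributing digital keys to close family members and friends is Shamir’s secret sharing, sketched below. This is purely illustrative of the concept, not the Humanized Internet’s actual design: a vault key is split into five shares, and any three of them are enough to reconstruct it.

```python
# Minimal sketch of the "digital keys to family members" idea using Shamir's
# secret sharing: a vault key is split into 5 shares, any 3 of which can
# reconstruct it. Requires Python 3.8+ for pow(x, -1, p) modular inverses.
import random

PRIME = 2**127 - 1  # a Mersenne prime large enough for a 16-byte key

def split(secret: int, shares: int = 5, threshold: int = 3):
    """Return `shares` points on a random polynomial with f(0) = secret."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, shares + 1)]

def reconstruct(points):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

vault_key = random.randrange(PRIME)          # stand-in for the lock-box key
family_shares = split(vault_key)             # one share per family member
recovered = reconstruct(family_shares[:3])   # any 3 shares are enough
print(recovered == vault_key)                # True
```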
Oscar: So at this point, the project is designing this platform.
Monique: Yes.
Oscar: So the goal is to have it ready in a few years?
Monique: Yes. It’s a process to design it and work with an ecosystem of organisations, but particularly universities, to look at how we can do that together.
Oscar: Yes. And how do you envision it at the time this platform is operational? Who will be the main actors that maintain it – let’s say governments or universities, could it also be companies? How do you see it?
Monique: I would say that the ecosystem would involve universities. I mean, we want this to be a standard globally, right? You cannot do it in the absence of governments and other organisations. It’s something where you have to take government organisations along on the journey. You know, this gets into regulatory tech and how they have to understand what it is they regulate.
But they have to be involved to some extent, because you’re talking about digital keys – they have to recognise them. Something has to be recognised. That which is portable has to be recognised by an authority.
So you have to have something that’s hybrid in there, and I think also public-private – so you’re getting into what I will call multilateral relationships, or an ecosystem that gets built out. What we’re trying to avoid is an over-centralisation that breaks the model we’re talking about in terms of self-sovereign identity. You want it to be, to some extent, decentralised or hybrid-decentralised, where over-centralisation doesn’t occur. This is very important from a development perspective, from a platform and principle perspective.
So that’s what we’re looking at. Even though you might think it’s B2B, I think at the end of the day it’s going to be consumer-B2B – you know, business to business to consumer – and the consumer is actually going to be in the middle at the end. I believe strongly that the consumer, the citizen, the person should be in the middle of that.
Oscar: So there will be some actors building components of this platform and then it becomes available for everybody. So for instance, from the end user point of view, will there be an application or just a web service? How would the individual interact with it?
Monique: Well, I mean you’d download an app – I think that’s what it’s going to be. I mean, it has got to be usable. We know that people to some extent have smartphones or mobile phones, even folks who have been displaced. They do have some level of technology, at a basic level. So I think it’s going to be something that they should be able to download from an app store and use. That would be the user experience perspective, and they will set the security knobs that they want to have on it.
They will decide what they want to disclose on it, et cetera. That’s the way I would envisage it, and what we’re envisaging. Then we will have certain keys – where they want to store the keys, how they want to store the keys, and how they want to share them with which members of their friends, family or organisation.
Then of course we get into this whole lock-box type of model. But from my perspective, and from the design perspective we’re looking at, it’s really at an app level.
The thing of it is, it’s easy to use, the complexity is hidden, and it’s portable. You know, when you talk to people who have been fleeing terrible situations, they always had a mobile phone with them. That was their GPS. That’s everything for them. So a mobile phone is the basic tool that people will have and that we will be using. In some cases there have been studies, particularly out of Germany, suggesting that mobile phones for people in the refugee community were considered extensions of their bodies. They couldn’t do anything without them.
Oscar: And once this platform is operational, how will it sustain itself? Because there is a cost to running this platform.
Monique: Yes. You’re getting to the funding and the business model itself, which is a fair question. You know, we’ve been thinking about how that could work, because it has to be something that people will want, that they will desire. And I think the funding is something that we have to look at – whether it’s public/private funding. If we’re in the EU, is this something that the EU would be interested in funding or having a funding model for?
Would it be a World Bank opportunity? Because one part of this is humanitarian – would it be that? We’re looking at different models for actual funding and continued funding, right? The thing of it is that what we would like to have is a community working on it.
You know, I think a model that’s very interesting is the way Hyperledger has developed, right? In the world of technology sets and specifically around identity – Hyperledger Aries and so on. I think when you have a community actually contributing to it, and making sure that there is continued contribution, it sort of self-funds, it self-feeds, right? And that’s where we want to be able to go, to where it becomes a standard at the end of the day. But I think the models of funding will depend, and we can envisage multifaceted funding models.
World Bank as an example is one that comes up immediately and also some of the European Union types of projects, especially when you think about H2020 or beyond. You know, those examples. Those become interesting funding models too.
Oscar: The book The Humanized Internet is already published, correct?
Monique: No, no. It’s about to be published. We’ve been working on the draft itself, and we have also had contributors. So we are about to publish it. We had noted it for the end of this year, but it will probably be the first quarter of next year for sure, because there’s quite a bit that we’ve had to update. Plus we want to have somebody very, very special write the foreword for the book.
The book is a reflection of quite a bit of the conversations that we’re having, and of personal journeys, and it spans a wide range – we have an example from folks who have been working especially on e-government as a service. We have a colleague who’s in the entertainment space, talking about the role of, let’s say, ethics and also privacy, particularly that which you are in control of in that space – what happens when we have algorithms and things go out of our control, or are not within our control, when we don’t have any framework of governance in place.
So it comes from a very multifaceted perspective, and we have examples about how to use blockchain for credentialing. I mean, you know, I believe in knowing what I don’t know – I went off and got a degree in the space, and my degree is actually credentialed on the blockchain.
So it’s very multifaceted and we’re very excited about it coming out.
Oscar: OK. So it’s coming in the first months of the New Year. Yeah, fabulous. Really looking forward to seeing that release and spreading the word about it.
Monique: Thank you.
Oscar: I know one of the other projects you are involved in today is the VETRI Foundation. Could you tell us something about it?
Monique: The VETRI Foundation. So, I’m the President of the VETRI Foundation. Let me say – this is one where you can actually go and download the app. This is something that exists today.
The VETRI Foundation was born out of – we had actually gone through an entire compliance process. We had a cryptocurrency, the VALID coins that were created. We are the non-profit arm of what is VETRI and what has also been procivis.ch. We went through very strict compliance at least two years ago.
The foundation was born literally in December a year ago. So as a foundation, what we have now is: you have a phone, we have a mobile app, and the tenet is that you should be the person selectively disclosing how you want to share your data – or, if you’re a company, how you deal with market research and surveys. By the way, you are incentivised with VALID coins. VALID is traded on the internet – you can go and look at the VALID cryptocurrency.
So what it is, is that rather than you being sort of the “product” of big companies – which is something we hear about – you are actually in the middle. You are selectively disclosing, and if you’re a company or an organisation, you’re very interested in that market data for research. That becomes very, very compelling for companies as well as consumers, and you’re incentivised through the use of this token or coin. They’re both valid.
As the foundation, we are a non-profit. We don’t take any percentage of the proceeds per se – if you’re a business, that is something that we negotiate, but as a consumer, we don’t, and we don’t hold your data. It’s frictionless – you are selectively disclosing, and it’s about what you care about. Do I care about healthcare? Am I willing to fill out a survey about healthcare? Do I want to know where my data is going to go from that survey?
And it’s nice to be able to receive a VALID token for the attention, for a minute of my time – and, by the way, I am choosing which categories of topics I’m interested in. So it’s very targeted for you.
Oscar: And for this, do the organisations have to agree to use this service, or…?
Monique: Well, no. I mean, the thing of it is, it’s gamified, right? So you download the app – you can go and look at the app. By the way, it’s either on your phone or you can have it on a desktop.
You can see how it goes. You can see the entire video about what the foundation is about. What is very critical is that we are a non-profit, so we will pay attention to projects that grow the VETRI ecosystem – that’s very important. We want to grow the VETRI ecosystem to spawn the use of the VALID token and to spawn this engagement between, let’s say, data owner and data provider. And if you’re a business, you care about how clean that targeted data is, because if the data source is not clean, the data means nothing. So this goes down to market research at a business level.
But for the consumer, it’s very, very valuable and we believe that this is the direction, the strategic direction that the market wants to move to and certain consumers want to move to.
Here’s a thesis too – I think this is something that we always quote as part of VETRI, and even when we’re talking about the Humanized Internet. The problem you have when you have centralised data stores is that they are open to hacks. Not a day goes by where data hasn’t been stolen.
So we have to look at ways to mitigate centralisation of data – because it is valuable, it gets stolen. That’s where we have to look at distributed models, or at the very least, I would say, hybrid decentralisation.
We would like to see full decentralisation at some point in time, a distributed model at some point in time. Secondly, big companies are certainly profiting from you. So why can’t you get a share of that profit? And why can’t you get not only the share, but also actually control where your data goes?
Oscar: Yeah, great. Definitely strong reasons to check out this model as well.
As many people listening to this podcast are also in business, running e-services or implementing them for others – from the ideas you have shared today, what’s something that people who are building and maintaining e-services should keep in mind?
Monique: Yeah, that’s a great question, Oscar. I think that we should have an industry model around the fair treatment of data as businesses. This is something I had spoken about in Berlin a couple of months ago, in October – how do we as an industry come together and standardise on the fair treatment of data?
Of course, that assumes that the source of that data is also fairly “clean” – I will put that in quotes – because the problem we have is metadata of metadata. You know, we’re just metadata.
So we as businesses – I mean, especially in the business community – should be looking at whether we have a standard on the fair treatment of data. And it’s not about reading a bunch of legal documents on GDPR or privacy laws, et cetera – community members, consumers and sometimes business people just don’t have the time or the attention to go through that, because it’s this sort of legalese.
However, it’s about having fair treatment of data just as you have fair trade, right? Fair treatment of data – and I think having that is very important.
Here’s the thing: it’s not about “here, we changed our privacy policy and you have to kind of sort of agree”. It’s really about getting into an interactive model with consumers and citizens. I think that’s more important. They need to be part of it rather than being sort of a subject. They need to be in the centre of it.
So businesses need to be careful. They need to state it themselves and say, “Look, we want you to be part of the fair treatment of data programme” – which I think needs to be standardised as an industry – “and this is not about you being the subject. It’s about you actually controlling the narrative.” That, I think, will be very palatable for consumers and citizens at the end of the day.
Oscar: Fair treatment of data, yes.
Monique: Uh-huh.
Oscar: Yeah, definitely an excellent idea for anybody who is building e-services and platforms to consider. Monique, now I would like to ask you something for individuals – how can anybody protect their digital identity? Could you share some tips?
Monique: Yeah, that’s a great question – you know, once again, Oscar, you’re full of wonderful questions and I’ve enjoyed the dialogue and discussion thus far.
I think it’s very important to be careful about your presence on the internet. It’s forever there – you leave a forensic trail that’s forever there. We have communities that love Instagram and selfies and so on and so forth. But you have to be careful about how you’re presenting your digital self over the internet.
I always say this: if you go and get a driver’s licence, you have to take so many tests and examinations. I do believe it’s important for companies and organisations, even down to when you’re buying a mobile phone, that they walk you through the privacy – how do you protect yourself, with various privacy examples – and that they are responsible.
I think it’s an ecosystem of accountability and responsibility. They sell you a product. They should be able to say, “Here, let us walk you through some of the basic privacy that you have to care about.”
Having dialogues like this is very, very important. I think just basic hygiene is very key. Having everything open on your phone is something we have to be concerned about, because your phone is everything – it’s hackable, and we’re hackable. So we have to take care, raise awareness and ask questions about the basic things we should turn on where we have privacy. Two-factor authentication is an example – that’s where you start to set your security knobs in place and make sure that you have some level of two-factor authentication. That’s very basic. These are the things that I think we need to do as an industry.
Oscar: Yeah. It’s an excellent point, the first thing you said – especially that the manufacturers who sell you a mobile phone somehow assume that you know how to protect yourself and know the basics about data protection, which is not true. And actually, some of these companies are somehow accomplices of the internet services that are misusing everybody’s data. That’s an excellent point.
Monique: Yeah, that is correct.
Oscar: Well, thanks a lot, Monique, for this conversation. As you have very interesting projects that are well worth following, please tell us how we can find out more about these projects and about yourself.
Monique: Sure. I mean look, I have a personal website at www.MoniqueMorrow.com. You can also reach us at the Humanized Internet website and also at the VETRI Global website. So there are multiple websites where you can see a common pattern of what I have just discussed. You know, I do have a Twitter feed at @moniquejmorrow and also certainly just engaging actively in the conversation through these websites would be great.
Oscar: Again, thanks a lot, Monique. I will be following your projects in this new year and hope that they really crystallise and go mainstream, because they are solving very important problems today.
Monique: Thank you, Oscar. I can’t stress enough that 2020 is the year of really cool projects that are going to come up, and in terms of self-sovereign identity, the time is now. I thank you for the opportunity to have a wonderful dialogue. We know this is an ecosystem play, and we look forward to working with many organisations, companies and governments along the way.
Oscar: It was a pleasure. All the best.
Monique: Thank you Oscar.
Oscar: Thanks for listening to this episode of Let’s Talk About Digital Identity produced by Ubisecure. Stay up to date with episodes at www.ubisecure.com/podcast or join us on Twitter at @ubisecure and use the hashtag #LTADI. Until next time.
[End of transcript]
About The Author: Francesca Hobson
As Senior Marketing Manager, Francesca aims to provide valuable insights on digital identity through our Let's Talk About Digital Identity podcast, blogs, industry events and content library.
More posts by Francesca Hobson