Responsible Use Of AI + Gender Gap + Inclusion with the President of the Humanized Internet – Monique Morrow

Responsible use of AI is so important, and it is the first topic we discuss in this great interview, where Monique shares so much value.

In this day and age it is so important to understand what is happening with technology and how it can help or hurt the society that we live in.

Who better to interview than Monique Morrow? She is doing amazing things in the world of technology.

If you want to visit her website to learn more, here is the link: Monique Morrow

In addition, check out Diversity and Inclusion in Tech to dig deeper into this topic.

Nathaniel Schooler
I’d like to introduce Monique Morrow. Monique is a chief technology strategist, a groundbreaking technologist and a proven innovator.

Monique is a former CTO at Cisco who has worked tirelessly to align technologies to society’s needs. Monique is also the co-founder of The Humanized Internet.

I’ve been thinking about AI for a few years. And like most people, I was initially worried. But I think now I’m actually more excited about the possibilities of working fewer hours, which is hopefully something we should be able to do in the near future.

Monique Morrow 1:02
What we hear about AI, as with the Industrial Revolution, is that anything repetitive, robots will take over; they will take over the job. So there is this negative narrative that comes out and says, for example, you’re going to lose your jobs. On the other hand, you see that and say, well, there will be new skills that are required, and so on. But that narrative can go further into information technologists, people who work in IT, because technology moves so fast. How fast are you thinking, how fast are we using our brains, becomes a very, very interesting subject in itself. So: AI for good, AI and ethics.

I’m involved with the AI for Action Group; I’m actually co-chair of its extended reality committee. We’re looking at a narrative that is very open to people, whether you’re a roboticist or a filmmaker or whatever it is you’re doing, looking at how we use these technologies. It covers the gamut of privacy, it covers the gamut of policy, so we can have a discussion in the industry that isn’t necessarily dystopian. AI could be used, for example, as a tool to diagnose your own diseases; you yourself can become the center of that Copernican revolution. Which is a good thing, a very positive thing.

The other part of it is AI taking over jobs. I strongly believe that AI can be a stimulant for job creation. For example, one of the narratives we talk about in the book ‘The People Centered Economy: The Ecosystem of Work’, which was published in November of this past year, is: could we imagine something nascent that you can create with the use of artificial intelligence, a nascent job as a service for you? With the inputs that you provide, you would say, I want to work in this geography, I want to earn this amount of money and I want to work with these types of people, and the technology could create some feedback for you and say, hey, look, we have the perfect job for you. Of course, it involves an ecosystem of players, which includes private industry and also government, because they will want to be part of this, and you have to ask who you disintermediate. By the way, you could pay a tax, or pay something, for that service. And you could disintermediate, if you will, the unemployment office, because you can guarantee somebody something they are opting in for. I kind of look at it as LinkedIn on steroids, if you will, but in reverse; it’s very, very interactive, you would be provided with the job, and it’s tailored for you. It’s a tailored job as a service.

Nathaniel Schooler 4:15
Right, so what you’re sort of saying is that this platform technology, this AI-powered technology, will slot in with people’s personality traits, their current skills, their mindset, their motivations, the location they want to go to, and the amount of money they want to earn. It will take into account their past career history and experience, and then it will go out and search for the jobs that would actually suit that individual and help them to upskill to get there. Is that what you’re saying?

Monique Morrow 4:51
It’s very much in that direction. It’s information you choose to share; I think that’s really important. I want to work with these types of people, etc. What’s very, very important is the privacy issue. You don’t want something that’s held by a centralized platform. I think what you provide should be able to disappear; think about that disappearance of data. You should be able to get instantaneous feedback and then see that the data has disappeared, or something to that effect. We have to honor the privacy regulations. Plus, we’re looking at this platform, we call it “Joblee.” If you look at “Joblee,” it doesn’t hold data, and that’s very, very important for you.
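
To make the idea concrete, here is a minimal sketch of how a “Joblee”-style matching request might look, assuming a hypothetical service where the candidate’s preferences exist only for the duration of the query. The class names, fields and matching rule below are illustrative only and do not describe the actual platform.

```python
from dataclasses import dataclass

@dataclass
class JobPreferences:
    """Inputs the candidate chooses to share; never written to storage."""
    geography: str
    minimum_salary: int
    team_style: str  # e.g. "collaborative" or "independent"

@dataclass
class Opening:
    title: str
    geography: str
    salary: int
    team_style: str

def match_openings(prefs: JobPreferences, openings: list[Opening]) -> list[Opening]:
    """Return tailored matches; the preferences live only for this call."""
    return [
        o for o in openings
        if o.geography == prefs.geography
        and o.salary >= prefs.minimum_salary
        and o.team_style == prefs.team_style
    ]

# Instantaneous feedback, with nothing retained by a central platform afterwards.
openings = [Opening("Data analyst", "Zurich", 90_000, "collaborative")]
prefs = JobPreferences(geography="Zurich", minimum_salary=80_000, team_style="collaborative")
print(match_openings(prefs, openings))
```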

Monique Morrow 5:38
Now, the thesis here is that what you have is something that is very interactive for you, which is very new, and you can see what is disintermediated at the end of the day. The government gets involved, because maybe you pay; maybe that’s a tax you pay for it as a service, we don’t know. For example, in the Netherlands, you wouldn’t be paying for such a service.

I think this gets into regulatory tech policy, and it gets into the ecosystem of players that we have in this realm. So that thesis, about using “Joblee” to create something for you, a nascent job as a service, is something we believe should be possible with technology. But we also believe it shouldn’t be a zero-sum game. We believe that people, no matter your age, no matter who you are, when they opt in to participate, should be able to have a nascent job created for them from the inputs that they give. So we think that’s important, and it’s a possibility.

Monique Morrow 7:09
I first talked about “Joblee” at the Web Summit in Portugal this past November. In fact, it was interesting; I was on a panel with people from the EU, from macro and government, from various governments, and the EU has a big vote this year. I think people were very intrigued, because this is a Copernican revolution: it is about putting people at the center and not having things happen to them. So that’s AI and job creation.

Monique Morrow 7:43
With AI for good, what we hear often is the dystopian side, this polarity between weapons of mass destruction and weapons of mass empowerment. I gave you an example of being able to self-diagnose your own cancer, etc.; we’re not far off.

Nathaniel Schooler 8:10
If you think about it, the Apple Watch Series 4 has already got an ECG on it as well.

Monique Morrow 8:17
Here’s the thing that becomes interesting: I would like to disintermediate the way health care is done, and you have the NHS in the UK. I was given a hypothetical: imagine paying for healthcare insurance at $1 a month. What would that look like? Is it possible? Why can’t we have a moonshot like that? I mean, we put people on the moon, the original moonshot. The reason you have to think about moonshots in this space, where technology could play or not play, is that I believe we’re still in the 19th century, especially when it comes to ageism. When Bismarck said people should be retiring at age 55, 60, 65,

Monique Morrow 9:05
People were barely living to 55, and we have not come out of that. We’ve come out of it with prejudice, with discrimination. Even in the healthcare system, there is a lot that is sort of gamified, me against you. So I think we need to think about using technology, like the watch example you gave. If I’m a good citizen, and I’m selectively sharing my data with an insurance provider, I expect my policy to go down.

Monique Morrow 9:39
I want my policy to go down. I don’t want the feedback to say, well, we’re sorry, that’s government regulated. Well, think about new paradigm shifts in these discussions. So that’s an example of: hey, we could be monitoring our healthcare, we could see whether or not we are at risk. This gets into genomics, into targeted smart medicine; we could actually see whether or not we’re going to be subjected to Parkinson’s disease, or dementia, Alzheimer’s, and so on. We should be able to see that, and to do something about it. That’s the power of targeted medicine.

Monique Morrow 10:00
So this is the power of technology at its best, especially when we’re thinking about artificial intelligence; think about IBM’s Watson healthcare example. We know the technology exists. Citizens have to become part of that, central to that universe.

The other component of it, however, is that we have to have a governance model, because you have the polarity with this mass empowerment: I can take care of myself, I can look at how I get a job as a service, and all of this kind of wonderful stuff; but the other pole, which the governance model has to address, is mass destruction.

This gets into the singularity, when robots take over; this sort of thing about robots having a superhuman mind, a mind of their own, etc. There is the stuff you see on film, and then there is stuff that is occurring now, like predicting future crimes. This isn’t Minority Report any more; this is stuff that is happening now.

Nathaniel Schooler 11:44
Oh, it’s happening. It’s actually fantastic. If you analyze someone’s behavior patterns based upon a video that’s being played live, then an operator can be alerted to it, in an airport for example, which prevents destruction as well.

Monique Morrow 12:22
Then you have to look at the thing that becomes problematic here, which is who’s watching whom. You get into a dialectic about surveillance, the surveillance society, etc., as perceived in some countries. For example, it is no surprise that China and the Russian Federation have come out to say that the country that controls, or leads in, artificial intelligence will rule the world. So you have to look at what that means.

Monique Morrow 12:55
That already is a statement at a state or political level. So here you can get into surveillance. We don’t have to look far back in history; I always give this example. When you had Nazi Germany and the SS working with what was part of IBM, the Hollerith punch cards were used to take a census. They knew exactly where to pick people up based on their ethnicity, their religion, and so on.

Monique Morrow 13:20
This was very targeted technology that was used. That’s why I said we don’t have to look far back in history, because it shows how governments behave; ‘who’s watching whom’. So a governance model has to be in place here. And when you democratize tools, which is very powerful, then we have to be careful about how citizens use these tools in a responsible way.

For example, if I don’t like you, is that spider in your shower a spider? Or is it a very small drone with anthrax on it? Now, we can sit back and say, well, you know. There are things that we have to pay attention to as technologists, and this is my point. You have to actually declare the intentional use of a technology from the very beginning. Just as a pack of cigarettes says it causes cancer, you can say, this is the intentional use of this technology. This is not to dissuade research in any way at all. But you have to say, this is how we intend it to be used; anything beyond that we’re not aware of, but you probably want to be careful, something to that effect.

Nathaniel Schooler 14:40
I think it’s very difficult when you consider that China has invested a lot more money than the US. From what I was reading the other day, a lot of the test cases around this have moved into mass production.

So the US now needs to catch up. I personally think China has probably over-invested a great deal, so they will have wasted a lot of money, because a lot of the things will turn out to be, well, did we really need that anyway? A lot of it is tech for tech’s sake; does it actually help anyone? I think a lot of those cases will just have been a waste of the Chinese government’s money.

Nathaniel Schooler 15:21
However, what I do think is that the US is going to catch up. But the major issue we face is the unconnectedness of China, Russia, the US and everyone else in the world. They’re not going to stand up and say, ‘Hey, yes, we’d love to be part of your AI for good organization’, because they don’t care. All they care about is their continent and their country. I would love to see it. I’m not being negative about it; I’m just raising an important point.

Monique Morrow 16:05
So with the geopolitics that is occurring, it’s very important to note that we have to think about what the relevant political systems are and what they should look like. We could talk about regulatory tech and what background people should have.

I was in Dubai with a group of over 200 specialists in this area, talking about artificial intelligence, and Dubai has an AI minister. I think that’s very key to thinking about what background people should have and what this looks like at the level of government. For example, when you elect an official, do you elect them because they have an AI background or a cybersecurity background? You usually elect an official on the platform they’re running on, you know, ‘I want my education’ or something to that effect.

Monique Morrow 17:02
I think what we’re going to see is more and more of the intersectionality between social science, political science and computer science coming together as a discipline. So you need to reach those people as government. This is what we mean by regulatory tech: government officials are going to need to have a background in this space, and they’re going to need to have a platform on issues around AI and its use. Cybersecurity is huge.

Monique Morrow 17:28
I just came out of Cybersecurity Days in Switzerland, where I was master of ceremonies, and I learned a lot. Governments are looking at the fact that when you’re under a state attack, you have minutes or seconds to respond, and you don’t know what the attribution of that attack is. That’s very, very disconcerting. They’re also looking at how you train small and medium businesses to be secure. Do you have a certificate for security in this space to say you are keeping yourself secure? And if you don’t, do you have to pay a fine? These are the types of things we have to talk about as society evolves, too.

Nathaniel Schooler 18:13
Yes, well, security is massive, isn’t it? It underpins all technology, and that’s the major thing. We’ve got this digital ecosystem; people like to use that sort of term, and I’m not a fan of it personally, but security underpins all of it. Without security, it just becomes a threat waiting to happen.

Monique Morrow 18:39
Somebody walked us through a lot of this during the day, because it’s all related: machine learning, smart cities, the Internet of Things and all of that exchange of information. You have to have cybersecurity and cyber defences as part of that discussion.

Here’s a narrative. You wake up one day, Nathaniel, and your mobile phone doesn’t work, the streetlights are going off and on, and you don’t have any power. Your energy sources have been attacked, the trains aren’t working, the subways aren’t working, the tubes aren’t working, and the hospitals are in dire shape. Nobody can call anyone. That’s what a massive attack can look like.

Monique Morrow 19:30
The stock exchange goes down quickly, and who are you going to call? Because you can’t call. So this is the sort of massive shape it can take, and it’s not meant to scare citizens, but for citizens to be very alert about what policies are going to be put in place by governments and by the military, and how we train people on basic hygiene, protecting your platform and so on.

Nathaniel Schooler 20:04
It reminds me of a trip I took to Belize. I went to get married in Belize back in 2011, and there was a competition for Miss Central America, so all these women were walking around in skimpy dresses. Then we got out of the airport, after we talked to them, and we went to the cash point machine (ATM) and tried to take some money out, and there was no money. So we walked round to the bank and bumped into some Belizeans who said, ‘No, man, you can’t get no money, for the phone is off.’ They were on strike for three weeks. I kid you not, you could not get any money for three weeks. Fortunately, we bumped into this lovely Christian lady and her daughter.

Nathaniel Schooler 21:01
We jumped on a boat to Caye Caulker, which is a fantastic little island in the Caribbean, and she lent us the money. Then someone gave us cash-back on a Visa card from a restaurant to pay her back. Then we just hung out and relaxed. It’s like, that stuff can happen. It brings you back to what we really are, and that’s humans. This is forgotten in today’s society. I mean, you have people who just live for their Instagram likes; they live for the Facebook messages and everything else. They don’t know how to have a conversation face to face in the real world. We’ve lost touch with compassion and empathy in many ways, and I think it’s because we can’t deal with our feelings.

Nathaniel Schooler 21:53
So for example, if you see someone on the street, a temporarily homeless person, let’s say, because they are all temporarily homeless, right? As much as they’re in that situation at the time, they could be helped out. But you see them there, and a lot of the time you just ignore them, because you can’t deal with the feeling that it gives you, can’t cope with that compassion and sympathy for someone that’s down on their luck. It’s really unfortunate. That is just a byproduct of society in general and our lack of caring these days. The churches used to treat homeless people better 100 years ago than they do now.

Nathaniel Schooler 22:46
That’s because, as a society, churches have kind of retreated within their own environments; they don’t want to upset anyone, and we’re just obsessed with political correctness as well.

Monique Morrow 23:04
When did doing the right thing become the wrong thing? In all of this, humans should be in the loop, because they have to interpret the data that’s presented. So if you’re going to push a button, you’d better know what you’re doing. I always talk about algorithmic decision-making and human rights as an example. We have lost empathy. The loss of empathy is something I’ve noted alongside a loss of trust. I think that’s where we have to gravitate: toward what is social good, what does good look like. We’re humans; we have to have that empathy as part of us.

Monique Morrow 23:46
As far as education is concerned, we have become so dependent upon the tools that we have. There’s a sticker that I have: ‘Googling is not research’. I remember talking to a chancellor of a university who said, I just want students to ask a question. If you cannot ask a question, if you cannot get into this discussion about right or wrong, and you constantly have the phone in your hand and are so dependent upon it, then you’re not able to think anymore.

Monique Morrow 24:15
That depth of thought and discussion is so lost on us. I will argue that we’re coming back to fill that deficit, that deficit of empathy, of listening, of posing questions, especially when we talk about ethics. It’s no surprise that philosophy is coming back into trend as a discipline. But we have to come back to the humanity, to what makes us human. I think that’s an important point you have articulated, Nathaniel, which I’m very, very supportive of.

Monique Morrow 25:03
The other part of this discussion, going back to job creation and robots taking over jobs, and the nascent job as a service we were talking about, is: should robots be taxed? These are big discussions that are occurring, because after all, they took over your job and people were thrown out of work.

You have to give people the chance to upskill. The other point I want to make, going back to humanity and the work environment itself, is that we have become so toxic. Maybe this is tangentially related to technology, maybe it’s not. But the whole idea of how we put people to work is something we have to be very, very careful about.

Monique Morrow 25:50
I think you could do some creative things. For example, say to someone: we have a deal for you, we’re going to give you a different package on certain conditions; you’re not going to get a bonus, but you’ll go away for two years, maybe you’ll teach in your village or your city, or maybe you’ll do something that has a social good part to it, because we need teachers or whatever. You still have the dignity of work, you still have a salary, and you’re doing something good for society. Then maybe after two years we’ll revisit whether or not you want to continue, or whether or not the group you’re working with wants to go further.

Monique Morrow 26:33
I think we need to be creative about how we treat people as, quote unquote, resources in companies, rather than just as numbers; looking at what we call purposeful innovation, rather than just pleasing the stock exchange. I think we have to look carefully at for-benefit companies versus revenue-driven companies. I will argue that for-benefit companies probably achieve more. There are some arguments for that, and I think we’re seeing more and more empirical evidence suggesting it is true, that people are moving toward that.

Nathaniel Schooler 27:22
Yes, the other day I was talking to Dr. Churchill. He’s the founder of American Angels; I don’t know if you know of him, but he’s a super interesting guy. He’s been involved with over 1,000 startups and he’s created tens of thousands of jobs with them. He’s got a process of angel investment, mentoring, funding, the whole works. We had this long conversation and he’s the first person to reiterate something I think you will agree with. So let’s look at the world and let’s look at consumerization.

If consumerization dies through the use of AI, the reduction of jobs, and companies making money without this social good element, without having to contribute towards society and the people who cannot work because they’ve been disrupted or don’t want to work, then the entire wheel of consumerization dies.

Nathaniel Schooler 28:24
So those companies, in essence, die themselves, and he’s the first person to raise that with me. I’ve been thinking about this for years, since I sat around a ’roundtable’ with IBM back in 2015, with the editor of Wired and some really great, really high up people from IBM. I was just sitting there and they were droning on and on, like this chap from Wired, and I was like, well, ‘what you’re saying really doesn’t mean anything, you’re just speaking for the sake of speaking’.

Then there is the whole problem that I’ve got with Google, right? If you want to research something, you talk to people and listen to content from people who’ve either done what you want to do or know more about it than you’ll ever get to know, because they’ve got 50 years of experience.

Nathaniel Schooler 29:17
That’s instead of looking at some result that Google has adjusted based upon an algorithm that is, in essence, gamed, because SEO people are great at SEO. An algorithm doesn’t give you high quality information; it gives you junk that is generally there for you to buy something, because otherwise, why would that piece of content be there in the first place?

Monique Morrow 29:43
That’s true. I would go back even to when the co-founders of Google said, look, do no evil. Well, ‘do no evil’ is the Hippocratic Oath at the end of the day. It’s really about that. I think we have developed a strange dependency on these tools; we’re lacking the conversation, the going back and forth, having these discussions in depth. On the loss of consumerism, a hypothetical that was put to us when we were all in Dubai was actually a very strange one, and that is: imagine 20 years from now that the world has 50% unemployment. What do you do? 50% unemployment!

It was very interesting, because the Chinese professor next to me said, that’s a great thing, they don’t have to work. The lady who was a nuclear physicist from Kinshasa said, ‘well, that’s a concern for us, because if the developed world’, and that was her view of what constitutes the developed and developing world, ‘has 50% unemployment, then we’re going to have a terrible effect in Africa.’ Africa has 54 countries, right?

Monique Morrow 30:57
So she saw it as very dystopian. One lawyer basically said, I guess that will probably push policy towards assisted death, which is extremely dystopian. Another said, well, let’s make sure we colonize Mars very quickly. The person who was actually sharing that hypothetical was Calum Chace.

Calum is the author of the book The Economic Singularity. Basically, what he was talking about is AI and the death of capitalism. He had an assumption that the economy as we know it, and the measures of the economy, would be breaking. Would that mean, if there was 50% unemployment, that you’d have to reduce the cost of living? You’d have to have different measures of economics. So the economists in the room were very uncomfortable with that discussion.

Monique Morrow 31:56
But it’s something that we need to think about, because the thesis here is that institutions as we know them are breaking. We need moonshots. I think, to the points you made earlier, we should be asking for those moonshots. Could we imagine zero unemployment? Could we imagine that with these technologies? Can we imagine dignity of work with these technologies?

Could we imagine a governance model around usage, responsible usage, of these technologies? And who’s watching whom, etc. You’re spot on about one thing: I don’t imagine states coming to the table and saying, as with nuclear proliferation or reduction, we’re going to use these technologies in a responsible way, because nobody will agree on what the responsible way is.

Nathaniel Schooler 32:55
But also, let’s step back a minute before we move on, because I know what I want to get to in a couple of minutes. What I was thinking was, what about the definition of work itself? I was talking to someone on a thread on my Facebook the other day. I was talking about unemployment and how I think it’s at an all-time low in the UK, and so on, and that there are still X amount of people who are not working. He actually raised a good point.

Nathaniel Schooler 33:32
He wrote, ‘Well, why do people have to work?’ So I thought, well, actually, you don’t have to work, but you have to enjoy doing what you’re doing. If you can make money and you can still enjoy what you’re doing, then the definition of the word ‘work’ is completely wrong anyway, because actually it isn’t work, it’s enjoyment. When you find that enjoyment in your life, you should get paid for it. Right?

Nathaniel Schooler 33:58
I’m talking to you and I’m building a few different things; I’m getting paid for something that I love to do. I love to talk to people, I love to reach out to them on social media, I love to use social media. So then it redefines, firstly, social media platforms. I mean, there’s a new blockchain-powered social platform coming which is going to give 50% of the revenue from ads to the people who are posting. You’re in full control of your data, so you don’t have to share anything if you don’t want to. Those sorts of things are going to destroy a lot of the revenue that’s going into Facebook and everything else. However, it still disturbs me that people could think, why do they have to work, and I think the word ‘work’ is wrong.

Monique Morrow 34:51
I agree. You’ve just raised another topic here: you should enjoy it, first and foremost. You have to enjoy waking up in the morning and doing something that is of value to you! What I’m talking about is how we can bring all of this so it brings value to you, not just you bringing value to something else, with some kind of compensation model: you wake up, you’re excited, and you’re compensated for it. We live in a society that says you do have to pay your rent and everything else, but you’re compensated because you’re doing what you’re doing, and you enjoy it.

Monique Morrow 35:04
Nathaniel, what also came out in Dubai was a study on wellbeing and policy. What we found is that people are burning out because of the toxic environment and so-called work structures. So having policies about wellbeing is very important. I want to wake up in the morning excited about this, not nervous about whether I’m on the layoff list or whatever list; excited about the people I’m working with and enjoying what I’m doing. I absolutely agree with you.

Monique Morrow 36:13
We have to bring that back into the definition of, quote unquote, ‘work’, where value is brought to you because of what you’re bringing, the counter-value, to society, but also where you’re compensated for it and you enjoy it. There should be an enjoyment aspect to what it is we do. Otherwise we get into the health issues: burnout, being overweight, stress, smoking, diabetes, you name it. People are just so challenged, and they’re so insecure. This is about bringing safety into your mindset, because you don’t have to worry about that anymore; removing that from the equation. Totally agree.

Nathaniel Schooler 37:06
There’s also another fundamental problem that we’ve got with technology and with people. It’s over-ambition, and over-ambition causes a lot of suicides. We’ve got all sorts of different problems: we’ve got over-ambition, which is like, ‘Oh yeah, I’m going to go and build the next PayPal or the next Cisco.’ It’s like, look, sort your brain out; the chances are you’re going to get run over by a bus five or twenty times before you’ve even managed to build anything that’s an attempt at that size. That’s the first ridiculous thought that we’ve got to get over.

Nathaniel Schooler 37:48
There’s nothing wrong with ambition. I’m really ambitious, and I’m also enjoying the journey. The problem is that people want to build a business and they want to build a career, and they don’t enjoy the journey. That is creating massive issues. We’ve got technology, and we were promised shorter working days, we were promised a better life and more money. Where the hell is it, right?

Nathaniel Schooler 38:17
You have to work 10 times harder than you’ve ever had to work in your life. You’ve got to manage your time really effectively. You’ve got to be really good at what you do, or you haven’t got a hope in hell of getting anywhere. It’s just an insane world.

Monique Morrow 39:20
I think we need to ask, ‘Wouldn’t it be lovely to create a new kind of model?’ I think that’s the opportunity for us all. I also agree with you that this blind ambition about ‘I’m going to be a billionaire by the age of 30’ is ridiculous. With it goes ‘I will squash you at any moment’; there’s winners and losers, and my argument is we don’t have to speak in winners-and-losers language.

Monique Morrow 39:52
The other thing is that people talk about failing fast. Fail fast is not forgiving for people with experience, because they’re judged harshly, and it probably gets us into the next model; and it’s certainly not forgiving for women. Fail fast can also be very, very disconcerting when you’re talking about critical infrastructure type systems; you don’t want to just mess with that.

Monique Morrow 40:24
So we need a new model, and the new model is this new social contract. It’s a new set of social parameters: bringing humanity back to technology, bringing humanity back to society. This shouldn’t be about haves and have-nots, and it certainly should not be about winners and losers. I will argue that we don’t have to talk about a zero-sum game in this new social contract.

Nathaniel Schooler 40:50
I agree, and also look at population growth. By 2050, the whole population of the world will have completely changed. I was looking at some stats the other day; I think you’ll find that Africa is going to be number one, India is going to be number two, and China is going to be number three, so we’re going to be subservient to these massive, massive economies. Everything’s totally changing, and I think it’s actually quite exciting.

Nathaniel Schooler 41:19
The major issue I have is with these life coaches that say you need to step out of your comfort zone. It’s like, well, actually, if stepping out of your comfort zone is your comfort zone, go ahead, step out of your comfort zone, but stop encouraging other people to be scared and stressed and everything else in their lives, because they don’t need to do that to progress and do something they enjoy right now. I have major issues with that one.

Monique Morrow 41:48
Yes. Well, stepping out of a comfort zone doesn’t mean that you have to be scared about it. You can be comfortable about what it is you’re doing, insofar as that comfort is not reinforcing what could be negative within society itself. I think it’s more about being able to challenge yourself, and what does ‘challenging yourself’ look like, right? What does it mean to step over the cliff and say, ‘Okay, maybe I want to reinvent myself in a new way, maybe I’m not comfortable with that’? Sometimes, those are the types of dialogues I myself have been involved in. So what happens is that people will sit there and say, ‘Well, you know, I don’t know, I’m not getting this’.

Monique Morrow 42:42
So this gets into mentorship and coaching overall. People think coaching is about finding your job. That’s not true; it’s about reflecting on yourself. I think we need to take some time to have those self-reflections about: what do I do that would be an improvement for me, and of value to me? I’m a person who believes that education is lifelong. I don’t care; I’ll probably be 99 years old, if I live that long, and just go on to the next thing. I just believe it.

Nathaniel Schooler 43:20
My dad’s 85. He would love to meet you. He went to MIT; he graduated in 1952 or something, and his dad went to MIT as well. He’s just learning how to use his computer right now, and he’s been listening to my podcast, and he just gets it. It’s about having an open mind and just continuing to learn. The funny thing is, I was going to finish this topic on exactly that point.

Monique Morrow 44:01
I think so, because we started out with AI and diversity and inclusion. Let me just address the one issue around inclusion, and then go into continuous learning. I think inclusion is a massively interesting discussion. The way I like to put inclusion, Nathaniel, is the following: if our companies and our organizations don’t reflect the society that we have, then something’s wrong; they don’t reflect your customers. I remember, when I was in private industry, walking into a meeting one night, and I was the last person in, and the person who was the lead for the customer said, ‘I was getting concerned, because the people who were talking to me didn’t look like me.’ So you have to think for one minute: if you’re building stuff, look at who’s involved in those teams, how do they look? I believe if we get this right, we don’t need diversity officers, because it becomes so automatic in the way our DNA operates. We are a diverse group of mindsets, a diverse group of people, whether by ethnicity, gender, etc.

Monique Morrow 45:28
Now, in technology, unfortunately, we’ve been seeing a lot less of that in the field. In fact, you can do the measures: if you go and look at the top 50 companies that are actually led by female CEOs or people of color or whatever, you can see some decline there, which is a concern. If you look at technologists and women in tech, or people of color in tech or whatever, you can see some concerns that are kind of red flags. I think this is a complex topic, but it all starts in the home, with who does the dishes, who takes out the rubbish and all of that. The thing we have to look at is how we as families are actually bringing up our children; the first steps are mindsets, that they’re not relegated to duties that are typical for boys or typical for girls. That’s a big, big topic area. The very last thing is about education and training.

Monique Morrow 46:38
I’m finishing my third master’s degree. I strongly believe that education, training and upskilling are not the responsibility of some company or some organization or some government. It’s really your responsibility. Yes, they can help with regard to funding, but in the end it’s your responsibility. I believe the thirst for knowledge should continue. We should always be posing questions. We should always be looking at how this stuff is affecting us. We should always be thinking about the what-ifs, and it doesn’t matter what your age is. Lifelong learning is something I believe we should be striving for at all times.

Nathaniel Schooler 47:29
But with the gender gap, the most important thing is equality, no doubt about it. The business side is bigger than that, though, isn’t it? If you look at demographics and you ask, ‘Well, okay, who are we going to serve?’, every business serves someone; they serve a demographic. For example, you might serve a really broad range of demographics, all the way from very, very young through to female, male, transgender, lesbian, gay, bisexual, whatever; it doesn’t actually matter. The point is that you’re serving individuals within those specific demographics, right? So that means you have to have people on your team who are within those demographics. Otherwise, how do you know what you’re doing?

Nathaniel Schooler 48:33
First of all, is it working? How do you know what you’re doing is right? How do you get your messaging right? How do you get your product right? How do you get your relationships working? If I’m completely different to someone, it’s very difficult for me to have a conversation with them. I talk to people who are lifelong learners; I don’t really have conversations with people who aren’t. I talk to people who are enthusiastic, not necessarily extroverted, but outgoing in some respects. We all seem to fit a very similar mold: an individual who likes big picture thinking and doesn’t really like the details. These are the kinds of personality, I would say, disorders that I have.

Monique Morrow 49:22
I think the thing is we tend to tap people who think like us, and I think that is just cognitive bias. We’re all infected by it; we have to be able to recognize it and respond to it. That’s the reality we live with: we like like minds. That’s also kind of a red light. We probably want to think about how to get the big elephants in the room out, to ask the question. So maybe I’m not in agreement, but we need to challenge one another. It’s not comfortable, and I think being uncomfortable is something that, as organizations, we should strive for.

Monique Morrow 50:05
The other thing is, I believe in servant leadership, and servant leadership is lacking. When I go through a door, I’ll talk to the person who’s collecting the rubbish, the person who’s serving the table. I believe in that, because that’s what servant leadership is about. It’s not about who’s at the board level or anything else; these are all human beings. I’ll tell you, there was a Harvard study, I think maybe around eight or nine years ago. There was a question at Harvard Business School that was worth, I think, 90 or 95% of the year’s grade. The professor asked: ‘Every day at 4.15pm, when you leave the class, there’s a person sweeping the hallway. What’s the name of that individual?’

Nathaniel Schooler 51:01
Wow!

Monique Morrow 51:09
If you cannot name that individual, I have failed you. That’s the point, right? We talk about all these sets of technology, but this is about servant leadership at the end of the day, which is what we need to bring forward more and more in our organizations. I’m not talking about a sweeping hypocrisy; I’m talking about the person who is serving you, and about you needing to serve that individual the right way, to honor them by asking questions about how they are doing. What is their name? Where are they coming from? They have a story. We talked about homeless people; this is about that, serving others.

Nathaniel Schooler 51:57
Of course, I share your exact opinion on that one. Literally, I talk to pretty much anyone, and I find it really important. It’s so important. I go to church every Sunday, and sometimes I’ll go and have a coffee with a few of the people afterwards. There’s this lady who’s cared for by the community. She comes to church with three teddy bears and puts them next to her, and she’s kind of off the wall; I don’t always know what to talk about with her. Then there’s a lady who’s the manager of the front desk in a Travelodge; real normal people, you know, great people. What I found really quite interesting is exactly what you’re talking about.

Nathaniel Schooler 52:57
So there’s an accountant who owns an accountancy firm, and he’s always giving out the service cards at the beginning. I was invited for coffee by the lady who’s the Travelodge receptionist; I really like her as a person. She’s very clued up, she’s really intelligent, she wants to do something different in her life and she’s open to learning. Those are my people, people who just have a totally open mind.

I sat down with her, and I sat down with this crazy woman who’s got hairy legs and is wearing shorts, and I’m thinking, I hope she doesn’t get a teddy bear out. Then there was another guy who’s cared for by the community; he’s been in mental health care for 40 years, had electric shock treatment and everything. Then one guy who was homeless and who, I think, is still bordering on being into drugs and stuff. But he’s quite a nice guy, actually; if he got away from the people he’s hanging out with, he’d be really successful.

Nathaniel Schooler 54:11
So I sat there, and I looked at her and I looked at him, and then this accountant guy walked past me and I said, ‘Oh, hi, how are you doing?’ The lady invited him to sit down and he just looked at all of us like we were trash. He looked at me like I was a loser for sitting there talking to these people. I just thought to myself, ‘you know what, I don’t want to talk to you anymore.’ I have no time for people like that. If you can’t be human, and you can’t talk to people and actually give a monkey’s about them, then the gender gap is an irrelevance completely, because it’s nothing to do with gender; it’s to do with the humanistic values that sit behind people as individuals. Whether you’re male, female, transsexual, whatever, you have to be able to communicate with other people on their level and be genuinely interested in their lives, their opinions and what their daily problems are.

Nathaniel Schooler 55:16
You know, it reminds me of NASA. Someone walked past the cleaner, but they walked back to ask, ‘what do you do here?’ The person said, ‘Well, I’m just putting a man on the moon, sir.’ That’s the whole point. Whatever your nationality, color or gender, you need to be inside that business, completely attached to the mission of the business and to its corporate social responsibility and yours, because those are now becoming entirely harmonized, from what I’ve been learning. So yeah, it’s fascinating. Isn’t that great?

Nathaniel Schooler 56:14
We’ve got to get to this next topic. I reckon we’re ready for information management, what do you think? Over to you, really, because I’ve never worked in a big business, and you’ve been chief technology officer at Cisco; tech is ingrained in you. I only started learning about tech about nine years ago, when my friend Eric said to me, ‘what are you doing, man, you need to learn how to use a computer properly.’ So after about a week of trying to use this computer, which I would happily have thrown out the window and driven over with a car, I said, ‘how do I do this?’ And he’d say, ‘Oh, this is what you do.’ Then a week later, I’d send him a message: ‘So how do I do that?’ And after the third message of me asking him, he said, ‘Google is your friend.’

Monique Morrow 57:15
I always say I’m an accidental engineer, coming into this space because of a curiosity about technology; I got into chips and stuff like that in the late 80s. Working for a tech company, and I have worked for a tech company, so that’s your point, brings in a different set of dynamics, and tech companies are very concerned. They do talk about zero-sum games, winning and losing, competing and being one, two, three in the marketplace. What had certainly been an issue when I was at Cisco was the startup community, you know, who’s going to eat your lunch very, very fast.

Monique Morrow 58:37
So I love that I’m now pretty much in startup communities, and I have a deep appreciation for them; we have a very interesting inclusion and diversity mindset, etc. I think it’s just so fun. The thing is, there was always a dialectic about when you eat your old to grow your young. I mean, when do you cannibalize your services? If it’s still working, why should we worry? When do we take new pivots in the industry? What does that look like? Especially as you still have to have people maintaining the business. The thing is, as you’re looking for new pivots, people think that’s the cool factor and they want to gravitate toward it; they’re not maintaining the business. So how do you reward the people who are still maintaining the business?

Monique Morrow 59:16
We certainly worried about that, and this is typical of very established companies, even the ones we’ve been talking about, Google, Facebook. How do you maintain that leadership? They’re constantly looking at startups. Why is that the case? Startups are agile, they’re moving faster, and they could present potential competition for you. So what happens typically, if there’s a match, is that you acquire them; you start eating them. You have to ask yourself, is that a good thing? Because you have to look at what the cultural fit is.

Monique Morrow 59:56
Now, these are people and organizations that have been used to helping and working with one another, and all of a sudden they have been swallowed up by the bigger organization. I have met quite a few people who said, ‘You know, I can’t wait to get out of this,’ after being absorbed like that. So this gets into a very hard discussion, which is what the culture looks like in these organizations. What should it look like in these very big organizations of 20,000, 30,000, 70,000, 100,000 people, over 100,000 people if you’re looking at IBM?

Monique Morrow 1:00:33
What does ‘feeling safe’ look like for people? What does fast look like? Could we imagine what our future could be, and what our customers are going to look like? I was always contrarian; I would always say our customers today may not be our customers tomorrow. So there are all of these dynamics and factors, but is it fun? It can be very fun.

Nathaniel Schooler 1:01:11
Maintaining your existing IT infrastructure, and the security of it, is the baseline. Then integrating other products, for example, is kind of what you do as an innovator and as a board member. You have to move forwards; otherwise they’re going to eat your lunch, because that’s the natural evolution of business. It’s been going on for hundreds of years.

Nathaniel Schooler 1:01:43
In the UK, my mom’s side of the family were in brewing. We had the eighth largest brewery in the UK back in the 1800s. A brewery would start, it would grow, it would buy pubs, it would make beer, and then it would grow and sell the business, or it would buy another brewery and grow bigger. That’s generally kind of what’s been happening in it. But as far as information management is concerned, it is underpinned by the security of the business, and it should be, shouldn’t it?

Monique Morrow 1:02:21
Absolutely, yes. I was going to say, early in my career I was involved in network management. What’s funny is that network management has now become cybersecurity. Nobody wanted to be involved in network management; it was always the last thing people cared about in those days.

If you think about it, managing the security of infrastructure should be table stakes, full stop. It should be table stakes, because so much is at risk. When we’re talking about cybersecurity, it shouldn’t just be some group doing it in a company; it should be a board-level responsibility. It has to be at the board level, because when something happens, when the stuff hits the fan, as we say, in a nice way, it becomes a very, very big factor in loss of trust, and somebody will fall on the sword, and it’s not that group. In some organizations it’s going to be the head of the company and the board members, right? That’s the sort of stuff.

Nathaniel Schooler 1:03:30
Would you also say that about managing information?

Monique Morrow 1:03:37
Yes, it’s everybody’s responsibility. Absolutely. That was my next follow-up: cybersecurity and network management are everybody’s responsibility. When I was at Cisco, we had a very interesting program where they kind of gamified it, like a jiu-jitsu program: pass a test and you get a purple belt, and so on up to a black belt. The whole point of the program was that it was everybody’s responsibility in the company. Whether you gamify it and create certifications around it, that’s fine, but it is everybody’s responsibility.

Nathaniel Schooler 1:04:20
So information management is basically everything to do with the business. Without information, a business isn’t a business. I mean, information might be your entire business.

Monique Morrow 1:04:33
Well, then you’re going into something that’s tangentially another topic, but it’s data at the end of the day. We’re in a data economy, we’ve been in a data economy, and we have to look at how we democratize data, where people are making money off of you, and so on and so forth. It depends on how you categorize information at the end of the day, but data is very, very valuable.

Monique Morrow 1:05:00
So the question is: do I have data about my competitors? Do I have data about future marketplaces? How am I managing the data of my system of suppliers? This gets into breaches, and then the loss of trust, etc. Then it gets into the question of whether we can imagine something decentralized, or hybrid decentralized, moving forward in the future, rather than having everything that is data centralized.

Nathaniel Schooler 1:05:27
Yes, and then how that data is stored, whether it’s encrypted, what containers it’s in. I mean, there’s so much innovation going on in this arena, you really need a team of people just to keep up with the innovations going on in the community.

Monique Morrow 1:05:57
You also need to make sure that people, the consumers, are aware. It’s very interesting; I think there was some study that came out that said 6% of consumers feel that they’re out of control, while 73 or 74% say they want to be in control of their data. They don’t want to just be notified after the fact.

Monique Morrow 1:06:03
They don’t want stuff happening to them and then being notified by some loyalty program, or some credit program or whatever group, that says, ‘We’re sorry, your data has been exposed, call this number.’ They do not like that. I think we need to be looking at how we hold these institutions responsible. It’s very important to put the consumer, the citizen, back at the center of the universe.

Nathaniel Schooler 1:06:30
Very much so. And also connecting banks to that. If you have data, it doesn’t take a rocket scientist to work out, for example... I just ordered my food, and thank God it arrived at quarter to, because you were coming on the hour. I ordered it from Sainsbury’s, so they’ve got my credit card details. So let’s take a scenario where a business has got someone’s card details, their address and everything else, and that data is breached. The bank should be held responsible for ensuring that the data is not used, and the bank should immediately be notified automatically. It’s a no-brainer.

Monique Morrow 1:07:28
I agree, Nathaniel. Where does accountability lie? Sometimes you get the hot potato effect: ‘Well, you didn’t read the fine print of your agreement,’ and so on and so forth. I agree.

Nathaniel Schooler 1:07:45
I’m excited. I think we’ve got a lot of potential ideas; we’ve got a lot of opportunities. If you think about all these amazing products that are being created as we speak, someone’s probably got the idea of how that could actually work. They’re probably already doing it; they might have done it already.

Monique Morrow 1:08:06
If you want to look at ownership of your data, and where that’s going... I’m going to be at CERN on March 11 and 12; March 12 is the celebration of 30 years of the World Wide Web. Berners-Lee is going to be there, and he’s certainly concerned about the misuse of this technology. MIT is looking at something called Project Solid. Technology shouldn’t be used for surveillance; in the end, you can become the purveyor and owner of your data, and decide how selectively you share it.

Monique Morrow 1:08:48
I think that is kind of the promise, looking forward: how we become more responsible for our own use rather than being dependent on central groups. I do agree with you on one thing with regard to banks and centralized institutions that have access to our data, whether they are phone companies or others: they have to be held accountable and take responsibility by saying, ‘We’re sorry you lost trust in us, we’re going to give you a rebate,’ or something to that effect. Right now, we still have the arrogance of institutions.

Nathaniel Schooler 1:09:24
They’re going to be on their arses sooner, excuse my French.

Monique Morrow 1:09:31
I agree.

Nathaniel Schooler 1:09:32
They really are. It also goes for phone companies as well; I totally agree. I’m getting calls all the time from these numbers in London. I just keep blocking them, because I search and see if it’s a spam number. Then they change the number and call you from a different number. I should be able to just do one click, and it should block that entire organization.

I should not even have to answer the phone. I shouldn’t have to do anything at all. That should go for emails too: if someone puts me on a list and sends me some email that I don’t want, it should just blacklist them immediately from sending me anything. Thankfully, I’ve got a really good web host who’s a friend of mine. He’s been running a web host for twenty-something years, and he built me my computer; I just got a new computer, 32 gigabytes of RAM, and I’m just like, woo hoo!

Nathaniel Schooler 1:10:30
You should definitely be able to be in control there. So on emails, I can go into the server and actually turn up the severity of email blocking. It’s on a scale of 1 to 100, and I’m at nine now. All I need to do is turn that up to 80, and then very few emails will even come through to me. I think I’m going to have to do that, because I’ve been getting a load of spam recently. I can also block specific countries from emailing me, and I love that feature; it’s just amazing to be able to do that.
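
As a rough illustration of the kind of server-side control Nathaniel describes, here is a minimal sketch assuming a hypothetical 1-to-100 severity dial and a country blocklist; the names, scoring and cut-off rule are invented for illustration and are not any particular host’s interface.

```python
# Hypothetical spam gate: a single severity dial (1-100) plus a country blocklist.
SEVERITY = 80                 # higher means more aggressive blocking
BLOCKED_COUNTRIES = {"XX"}    # placeholder country codes to reject outright

def allow_email(spam_score: int, sender_country: str) -> bool:
    """Return True if the message should reach the inbox."""
    if sender_country in BLOCKED_COUNTRIES:
        return False
    # At severity 9, only messages scoring above 91 are dropped;
    # turning the dial up to 80 drops anything scoring above 20.
    return spam_score <= 100 - SEVERITY

print(allow_email(spam_score=15, sender_country="GB"))  # True: below the cut-off
print(allow_email(spam_score=45, sender_country="GB"))  # False at severity 80
print(allow_email(spam_score=15, sender_country="XX"))  # False: blocked country
```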

Monique Morrow 1:11:21
That’s really cool. We need more of that, because we have email leakage and there are security implications to that. I totally agree.

Nathaniel Schooler 1:11:33
My brother has been working for Cisco for years and has educated me a lot around security. But I can’t thank you enough for your time.

Monique Morrow 1:11:43
It’s been cool.

Nathaniel Schooler 1:11:45
Really cool. Really cool.

Monique Morrow 1:11:47
I’ve enjoyed the conversation, and congratulations on what you’re doing with podcasting and getting the information out there. I think it’s so important to get these narratives out there. This has been a wonderful discussion, and we could even have a follow-up on several topics.

Nathaniel Schooler 1:12:06
I’d like that very much. Thanks so much for listening. Please subscribe wherever you prefer, and share with your friends. If you enjoyed the show, drop us a review on iTunes or wherever you listen.