[00:00:00] Maiko: In today's episode, I speak to Khyati Sundaram, the CEO of Applied, a platform helping businesses to take bias out of their recruitment processes through anonymized recruiting and algorithms that help them predict which candidates will perform best in their roles, based on objective criteria rather than subjective judgments and biases.
The company also offers AI-based tools to help write job descriptions that are likely to attract a diverse set of candidates rather than biasing towards a certain set of candidates. It's really good to have you on the show, Khyati. Thanks for making the time.
[00:00:36] Khyati: Thanks for having me here, Maiko. Good to be here.
[00:00:38] Maiko: So let's start with your personal journey. You've actually been an entrepreneur for a bit. You've also worked with a range of different startups and tech companies. So let's start with you giving us an introduction to yourself and the biggest thing I'd like to understand is why do you do what you do now? What is it that drives you personally to do this?
[00:00:57] Khyati: Of course. So my journey is quite a bit of a jigsaw. It's very non-linear. I started as a trained economist, went and did mainstream banking, realized that wasn't my cup of tea, and started my first tech startup back in 2014, where I was using AI to make more sustainable supply chains.
And that didn't go according to plan, and there were many, many, many lessons. This is my second time leading a startup, and I suppose the second time I've got luckier, and I'm now also coming from a place of much more personal despair in solving this problem. So I suppose I have more fire, which will tell you why I do what I do.
So, just to take a step back: this is 2018. I had shut my first startup down and I was looking for my next gig. I still wanted to carry forth the same themes of tech for good, AI, data, scalability, and I started applying for jobs in the same space. So think AI, think C-suite of startups in London and Europe. Over about eight months I must have sent hundreds of CVs and cover letters.
Bear in mind, this is double the amount of work, which every candidate absolutely hates and detests: sending CVs and cover letters. I was doing that, and I barely received any responses. I must have got two responses in a span of eight months. And that's even though I'd run a company for three years before that and done multiple other roles as a finance person and as an economist, which arguably teaches you a different set of skills.
And in my head I had all these skills, and I was pouring all that passion into every application, every cover letter, and I wasn't getting through. So there is this narrative that starts forming, that you've done something wrong and you haven't really been able to understand what the market needs and what kinds of jobs are out there for you.
So I started going down a spiral, which was really, really bad for me personally. At that point, I remember I was completely at my rope's end, trying to figure out what I should do next. I paid people to write my CV. I paid people to write my cover letter. I found a professional head hunter and built up what I thought was a good relationship over a few months, but I was just being ghosted, even by head hunters. This particular person, let's call him Bob for the purposes of the story: I couldn't get hold of Bob for over three months, and after chasing him again and again and again, one day he comes over and says, I'm really sorry, by the way, your CV doesn't really fit anywhere, so I'm hesitant to put it in front of employers. And that was absolutely the last straw for me. And bear in mind, my CV looks quite healthy.
I've had some privilege to go to good schools. I've worked in the biggest banks on Wall Street and in the UK. I've started a company; arguably it didn't go according to plan, but 99% of them don't. And I was just in a really, really bad place, thinking that this is not what hiring is meant to be. Surely there is a job out there and an employer out there who can match their job to the skills I bring.
And when I dug into it, I realized that the incentive systems are all broken, and that's where the biases come into play. There are so many biases, about 180 according to academic research, that all come into play at different points in the hiring funnel. So think of it; I could give you some examples.
This head hunter would get three shots in front of an employer with my CV. So he needs three stellar CVs to make his cut, right? He needs to give those three CVs to the employer, and the employer should be super happy about the three CVs, because they're all looking for that CV that fits the part, has done certain things, ticks the checkboxes.
And mine didn't do any of those, because it was so non-linear. Nobody could put me in the box that people wanted to see. So I started getting questions like, you've actually run a company before, are you sure you can report to the CEO now? And that's a bias of its own kind. I had other people saying, oh, you haven't gone to these schools, and I only hire from this particular set of schools.
And that's affinity bias in play. You understand certain things, you familiarize yourself with certain things and that's what you want to do. So there's many such biases and I just started digging into it, and I suppose that's what got me really interested in how things are completely, massively broken, and that's why I'm here.
[00:05:22] Maiko: Got it. You already jumped into the biases. What are the most common biases you see that companies have in their recruitment processes?
[00:05:31] Khyati: It's quite an expansive landscape, so if you start looking at just the front end, which is the screening process, the usual screening process will involve a CV or a LinkedIn profile screen.
And at that point you could already be talking about hundreds of biases in play. So I talk about something like affinity bias: if you've gone to certain schools, maybe you've gone to the same school as I have, Maiko, so I think you're better, and I'll call you for an interview. Maybe your name sounds familiar.
Maybe you're from the same country as I was, or the same village as I was; I'll call you for an interview. So that's all affinity bias. There's another common kind of bias, which is the halo effect. Which is: okay, you've been a developer at Google for six years, which automatically tells me you're a great person and you can do this job in my company.
Which is not true, entirely not true, because you might have been a good developer at Google, but that doesn't mean all those skills will translate to a startup like my 20-person company. And so those are all the skills that we need to test for, which get superseded when we have all these mental shortcuts that we are resorting to.
Another common bias in the later stage of the journey is stereotype bias. So as you start calling people in, we know we have this mindset that women are not great at STEM roles. I've seen that in hiring funnels, where people will opt out and not call in women, or not call in certain sections of society for technical roles, because they think those people won't be good at the job.
So it's all these kinds of different biases that start playing a role in human decision making, and it makes things really catastrophic for both the candidate and the employer. And these are biases that are hard to contain, because it's all tribal in nature. In fact, we are all like that. If I line up a hundred people, 99 will say they're not biased, but 99 probably are biased, and for the one person we might be able to argue they're not.
And it's just hard to train it out, which is why we need guardrail systems in place to prevent this happening again and again and again.
[00:07:33] Maiko: And what does this look like? Let's actually dive into the solution that Applied has developed. How does the product work? What are the kinds of tools that you give to employers to de-bias their recruitment?
[00:07:43] Khyati: So we sit as an end-to-end hiring platform, in very simple terms. If you think of a status quo journey of a candidate, you would put in a CV or a cover letter. Then potentially you would be asked to do some tests or assessments, after which you would probably be called in for several rounds of interviews, and there might be a meet-the-boss interview as well, where the top gun in the company needs to vet you, effectively.
So we've tried to flip that entire process and say, at any given point in that process we know there are chances of bias, and therefore Applied can step in to mitigate those biases. So at the top of the funnel we firstly, as you mentioned in my introduction, help you write really robust, inclusive job descriptions.
So we at least open up the top of the funnel. And we've seen employers who use our job description tool have been able to open up the funnel by as much as 10% from women and ethnic minorities. And these are for all kinds of roles. So STEM roles, really difficult technical roles, really niche roles, we are opening up the funnel, so that's point number one.
We try to mitigate the bias at that place where people just go like, you know what, the pool is not big enough, or there aren't enough women who do coding. We're trying to combat that bias at that stage. At the second stage, which is screening, that's where usually you would resort to a CV or a LinkedIn screen, and we are saying that's fine, but if you are using a CV, how do you know the objectivity of it?
Because from my perspective, all you're testing for is how well a person can write, and that's what the CV is. And if that skill is required for the job, great. So if it's an editorial job, great, go ahead and test for how well the person can write in the CV and the cover letter. But what about the other skills required in the job?
If it's a sales job, what other skills and attitudes do you need to screen for? And that's the holistic, skills-based approach we are bringing at the screening stage. So you could think of it as giving you much more data, much more insight upfront in the funnel, so that you don't weed out deserving candidates and can really take the stars through.
That trick is being missed by most people. Most people are screening on CVs and then have to scroll through thousands of CVs, most of which will go in the bin because they don't look the part. And we miss out on deserving candidates because of all these biases. So that's the second stage: how do you screen, and how do you replace that piece of paper with something more robust?
And then the third stage is, we enable you to do interviews as well through our platform. But again, we take a very different approach. It is all about structuring the interviews such that every candidate is treated the same. So if you're asking five questions, each candidate should be asked the same questions.
They should, again, be skill-based questions. They shouldn't be like a fireside chat where a boss goes and says, I'd like to hang out with you at 5:00 PM in the pub, versus, I don't think I can spend my plane ride with you. So those are the kinds of subjective things we are trying to take out of the hiring funnel.
And then throughout this process, what I would call the pumping heart of Applied is the data analytics. Throughout the process, we can point out to you what's working and what's not working. You can iterate on the first round of hiring and improve your second round of hiring. And it's very granular, granular to the extent that if someone asked a really rubbish question in round two of interview three,
we would be able to point to that and say, you know what, that question that was asked was really rubbish in terms of predictivity. It didn't test any skills, and by the way, it caused half of your women pipeline to drop out. And that's the kind of accountability that we are bringing to hiring.
[00:11:24] Maiko: That's amazing.
Some really good tools there. So one of the challenges that I see with a very skills-focused, evidence-based approach to hiring is that every job has a range of different skills and experiences that are needed. If you are hiring a software engineer, you need something different than for a salesperson, etcetera. And first of all understanding the set of skills required, and then how to test for them in a non-biased way, seems like a very individual challenge for each job role.
So I'm wondering, how do you overcome that challenge with Applied? Do you help companies design those skills tests, or do you just provide them for different types of roles and different types of skills required? How does it actually work in detail?
[00:12:08] Khyati: You're absolutely right. It is a challenge to give super bespoke skills-based testing for each role, which is why not a lot of solutions have done that well in the past.
What we are doing differently is building a web of skills that can be connected to a job title or a job description, and that's why it's taking us so long. It takes years and years of data to understand what kinds of skills are the most predictive and the top skills you could ask for a certain job. And we've spent the last four years building that.
So one side of it is matching the skills to the job descriptions and their job titles and understanding really, are these five skills enough? If you tested for these five skills, can I robustly argue that this is the best person for the job? And in 85% of the cases, our algorithms say yes because we've mapped that over the last four or five years.
So that's one part of the puzzle. The second part actually is how you create the questions around those skills, even if you've been able to identify the skills. And that's what we are building now. So we're building a battery of tests, and we build them in three different ways. One is, of course, we build them in house.
So we have organizational psychologists and behavioral scientists in house who think about what skills would be required per job, and they do a full record of a certain job, be it a sales job or a product job. We use our own employees, because we, of course, needless to say, use Applied to hire employees internally.
So we use the intelligence of our own employees to understand what kind of questions would they like to see on a test. So that's where our own employees come into play. The second way we do that is we hire sometimes specialists and consultants to build our tests. And the third way is we of course, open up crowdsourcing from our customers.
So we've got about 300 customers now. We've had half a million applications through the platform. So that's millions of data points in terms of questions that can be mapped to skills, and we are constantly looking at it, cleaning it up, iterating and understanding from, let's say you wanted to hire a sales role through our platform.
We've seen 25 skills tested for sales roles over the last four years. Which five are the most predictive? That's where a lot of data science and analytics comes in. In the future, if someone were to hire for a sales role, can I just predict these top five skills and say they should test for those? So it's a lot about working and iterating and understanding how the market moves, how the workforce is moving, but also what skills are needed in every organization, which is why we think of Applied as a decision intelligence tool, or an intelligence framework, that will evolve as your organization and workforce evolve. And it only gets better the more data you feed into it.
[00:14:56] Maiko: So, you've actually built predictive algorithms to help show employers which candidates are most likely to succeed in these roles. That's one of the features, isn't it?
[00:15:06] Khyati: Yeah. So we are building it, but to clarify for your audience, we haven't released full-scale machine learning yet, because I'm not fully confident that the machine learning we could create today would be fully devoid of biases.
So we don't have machine learning to say whether you get a job or not; that's still decided by humans. Where we have some algorithms and machine learning is in doing all the data analysis. So thinking, okay, you want to hire a salesperson, we can predict these five skills for you. You still have to go and then write questions, or you can take inspiration from our battery of questions.
So the process uses a mix of what I like to call human intelligence and automated intelligence.
Got it.
[00:15:46] Maiko: You're bringing up a really important point. It's ridiculously difficult to design AI that is actually not biased. How are you approaching that? Because that's obviously one of the biggest AI and data ethics questions out there: how do we make sure we're not bringing another type of bias into the data sets or into the algorithm?
[00:16:06] Khyati: That's the million dollar question, Maiko. I've seen so many implementations of AI where the majority of them are adding bias, right? So if you think of any keyword search: most CVs are now going through a keyword search tool. These keyword search tools are filtering out, and I know this for a fact, sometimes they're filtering out on types of names or ethnicity.
For example, I know this company that only hires Indians because they think their work ethic is great. And don't get me wrong, I'm of Indian origin, but it's still a bias in its own right, and it's weeding out deserving candidates. So we know that you can implement AI to add bias to the system, and 95% of products out there are doing that.
So my baseline is harm reduction. I don't want to add bias, but I want to reduce the harm that is being done by all of the AI systems out in the world, and we have enough and more evidence of this happening with the Amazons of the world and the LinkedIns of the world where they had to retract their AI bots in recruitment.
So for us, it is twofold. One, really understanding what models we are building and how we are building those models. And the second piece, which is even more important, is who is building those models? So I have a very representative, highly diverse team. We have men, women, non-binary people, we have various ethnicities.
They all come and work together and they can bring different perspectives to the problem. And that's what I think is the missing piece. Cuz if you had a team that was really empathetic and really understood the problem from all perspectives, they would surely not release AI that was adding to bias in the marketplace.
[00:17:41] Maiko: Got it. Let's move towards some practical advice. We have a lot of early stage founders listening to the podcast. I've worked with a bunch of different startups across the years as a consultant, as an investor, and as a founder myself, and I've barely ever seen any early stage startup that has done de-biased recruiting well.
Not necessarily because the founders were bad people, or they were like, oh, all this diversity stuff is not important to me. It was important to them, but in reality they maybe didn't actually prioritize it, because of the short term pressure. You're a small team of one or two people; what should I even be doing, right?
So is there any advice that you can share with early stage, small companies on what they can start doing to make progress in a lean way, or establish proper diversity standards, without making this so scary that they say, okay, it's the only thing I should be focused on?
[00:18:36] Khyati: Completely. I will give you my example from my previous startup when I was hiring, and that's where I made some of my biggest mistakes.
What every founder forgets, or what every company on this planet forgets, is that they are effectively a recruitment company, because you are only as good as your team. I think that is something that is not taught in any classroom. That is something that some people are very lucky to understand early in their careers.
But a lot of people don't get that. And what happens is you quite often think that you know how best to hire, but a lot of startups don't know how to hire. And the question is, one, where do they start? The second question is, once they start, do they know whether they're doing it right or wrong? Answering the first part of the question will add some color to the second part.
I think it's important to know where to start. Most people will go back to their schools or their networks to hire, and that's what I did. That's exactly what I did in my first startup; my entire team was from my business school. And, well, I mean, I could talk to you about that first experience, but it wasn't great, because we were hiring people with similar perspectives and similar backgrounds, and that's not conducive to building the most robust, scalable business at that stage.
That understanding needs to come through. Not everybody is at that stage to have that understanding, and that's fine, but that's the second part of the question. That's where solutions like Applied can come in. Now, for startups, I think Applied is a bit expensive, and that's the honest truth of it: if you are two people and you haven't raised any funding, you're not going to spend money on a technology solution that you think is a nice-to-have if it's diversity; and even if it's data-driven, robust, evidence-based hiring, you would still not need it until maybe 10 hires are in place.
As a founder, that's what you would think. Now, we work with founders who have started early or started late, and people are on their own journey. And as a CEO I'm happy to meet them and help them wherever they are in the journey. But the biggest advice I can give them is, one, a realization, or at least a conversation, about where to start.
And not just start in your networks or your business schools or your universities, because that's not going to be the best mechanism to build a robust team. And the second thing is to think of what possible low tech, cheap or free solutions you can build in place. So I know for a fact that there are hundreds of free job description tools online that will help you write better job descriptions.
Fine, they might not be as good as ours, because we charge for it, but you could still start there. So there are lots of resources now; it's not 2010, where you didn't have resources online. There's quite a bit online, and there's lots of help around that's free, and you could use it. Even if you didn't want to use software,
there's lots of things you could do. For example, just have a systematized spreadsheet and your systematized spreadsheet could have five questions. And it's the same questions that you ask of every candidate. So every candidate gets the same experience and you're starting to mitigate a little bit of the subjectivity throughout.
So it's these tiny little things that you could do. But I think it comes from the premise that every company is a recruitment company, and until they realize that, it's going to be a very difficult challenge for most people.
[00:21:45] Maiko: Got it. Is there any advice or tips you can share specifically on job descriptions and job ads?
I've seen people speak about making the salary transparent; if you don't include it, that later on actually leads to potential bias in the salary negotiation, because you have different groups negotiating differently, leading to a loss of equity there. I've always been frustrated by job descriptions that say at least five years' experience,
or graduated from a top tier university, or things like that. I assume these may be some of the things to avoid, potentially, I dunno. But I'd love to ask you: what are the top things that you should definitely not put into your job description, or things that you should add to it, to recruit more diverse talent?
[00:22:31] Khyati: So the first thing, as you said, salaries, we advocate, you put in salaries, and that's because it makes it easier in the later part post hire as well. So at Applied, we are big champions of making very transparent progression frameworks that map out the different job titles and salaries. So if you've started that journey already in the job description, it is much easier to onboard people and then progress them throughout the organization if it's all fairly transparent.
So that's number one. Number two, as I said previously, we advocate that people make sure jobs are written in a very inclusive way. If you think of the wording or the phrasing of technology jobs in particular, it was developed back in the 1980s or seventies, or whenever, when the world was much more homogenous.
So if you start using words like ninja and, I don't know, rockstar, and all of these kinds of heavily masculine-coded words, we've seen that it drops out women and it drops out ethnic minorities, because they don't associate with those mental images. And it's less about what words have been encoded in the job description per se.
It's also more about what society has created as a construct for certain minorities and women. So it is more about understanding how to mitigate for those words and phrases, making it more inclusive. And this is where I say there's lots of free online help. And the third thing I would like to point out is that we distinguish between expertise and experience at Applied.
So in my example, for me in particular, I worked in banking for six years. Let's take that. I could have been a completely rubbish banker for six years, but if someone just looks at my experience, they'd be like, oh, okay, JP Morgan, RBS, six years, great. No, it's not. You don't know anything about my skills as a banker.
So that's what we are saying. We are discriminating between expertise and how many years you've worked. So we ask you to remove your work experience, the number of years, and where you've worked; all of these things are biases that should not have relevance to whether you can do the job. So it's about finding, in my example, the best banker, or the best salesperson, or the best product manager.
And that is quite often decoupled from how long you worked as a product manager or where you worked as a product manager. And those are the things that we have to be conscious about in writing in a job description.
[00:24:51] Maiko: Got it. What do you think are actually the biggest hurdles for companies adopting this at scale? As much as there has been a much bigger focus on diversity, equity, and inclusion, and also on diverse hiring practices, in the last few years,
I still think we're at the very beginning. I'm not actively looking for jobs, but occasionally I browse jobs on LinkedIn and things like that, and I'm just thinking, how are people still getting away with some of these job specs? So what do you think is the biggest barrier to companies adopting this?
[00:25:21] Khyati: It's twofold. For me, the biggest challenge we see in selling Applied as a product, and in customers or prospects adopting it as a solution, is the status quo. It is that comfort blanket of, we've done this for 40-odd years, we know how to do it. And it doesn't matter if some things or some decisions go wrong, because, by the way, there is no ROI. That's the second piece.
Where is the ROI? Who's holding you accountable for incorrect decisions in HR? By the time you know that a person is not going to make the cut, that's already 6 or 12 months later; they're already in. And the person who's making the decision about whether they can progress, or whether they're a bad hire,
is a different person, removed from the person who did the screening and probably got them into the organization. And that's the challenge: the status quo is too heavy, and there is no ROI. At Applied, we're trying to beat both of those, saying, let's educate you. For me, a lot of Applied is not just giving you a solution, but educating the market.
There is a problem, and the problem is the status quo; we just need to have a leap of faith and move away from that. And the second is, we are giving you the ROI. So if you're worried, here is a silver platter on which I will serve you all the data; you can make an informed decision and know which bits work and which bits don't.
Both of those things, I think, are a challenge, especially if you're a bigger company; there is a lot of inertia. We just would not be able to sell to a bigger company with that kind of solution. So we have to understand what the pragmatics of selling at scale look like, which is what we are trying to work out right now as a company.
[00:26:58] Maiko: Got it. How does that change happen in bigger companies, which is kind of your key focus area? Obviously it's not the one- or two-person startup that you are mainly selling to; it's really the more scaled companies. An advantage is they may have the resources to invest and all that, but they're also potentially slower moving or more challenging to change.
Again, as you said, they've had their hiring practices for years and years. How do you go about this? I assume you would be talking to maybe some diversity managers; you would talk to human resources. But then you also have the hiring managers in the organization. How does that process actually happen, of Applied being used in the organization and them actually transforming what they have done, in many cases, for decades?
[00:27:41] Khyati: Yeah, so I think it's about finding your sweet spot. We have taken our time to find a sweet spot, which is roughly a 100-to-300-person organisation. So as you said, anything like a five-person organization is probably not worth it in terms of time and money, and if it's a 5,000-person organization, you can't change them overnight.
It's probably a 12-to-24-month process, and as a small startup ourselves, we are 30 people, we can't afford that level of hand-holding. So it's been a lot about testing and understanding what our sweet spot looks like. Within the 100-to-300-person organization, it's not as simple as we'd like it to be, but we've learned and adapted, which is a great thing.
So what we would do normally is our buyers would be a chief people officer or head of talent. It's rarely a diversity person or a diversity manager. I personally have a gripe with that title and that role in many organizations. Therefore, we try and hit the decision makers. Quite often, the diversity manager is not the decision maker, and that's the unfortunate reality of most organizations.
So the decision makers for us would be, if it's a smaller company, the chief operating officer; if it's a slightly larger company, the head of talent. And what we do a lot in the beginning is the education piece of bringing together the head of talent, or the CEO, with the hiring manager.
So it is democratizing hiring in that way. We're building a funnel and a software platform where all of these people can come together and have a conversation about the hiring. Whereas in other companies, we see it's quite isolated. You have an HR function or the talent function who will bring in the talent, and then the hiring managers take them off and do whatever they want with it.
That's not what we want. We are elevating the HR function in an organization, and we want it to work in concert with the hiring managers, whether the hiring manager is a product manager, an engineering manager, or a sales manager. And that's what we do. That takes a lot of education, of course. So we have lots of content.
We have to do lots of hand-holding in terms of webinars. We do onboarding sessions where we bring together different customers, sit with them, ask about their problems, and work with them on the solution. So it does take a little bit of time, but we've seen that once we've got that champion,
it scales really well. Because once we've got either a head of talent or one particular hiring manager who has loved the product and seen their first hire through the product, most often they come back to us and say they would never have made that hire if they were looking at a CV. And I'm very confident about the product.
So we know that in 95% of the cases, the person at the top of that screening list is the person who will thrive and get promoted in your job, and therefore we know it's the right person for you. But it takes, of course, a little bit of time, because they have to work with the person.
They have to spend some time and understand how the person fits in the team. But every time, and this is for every hire we've done, and we've now done 13,000 hires on the platform, people come back to us and say, I would not have hired this person if it were not for Applied. And that's the change. That is the value creation, or the light bulb moment, you see for the customers.
Once that's happened, then it's easy peasy.
[00:30:57] Maiko: Wow, it's really great to hear that you've made such advancements already. I'm sure if you're in the seat of being the CEO and on the team, there are always a million more things to do, but it's great to hear the progress already. I've got one more question for you, and that's about the future.
If you think 10 years ahead from now, what does the world look like if Applied is successful? If you continue to be successful, I have to ask, what does the ideal world of work look like in 10 years?
[00:31:24] Khyati: That's a very exciting question. But if you think of it, at its core, Applied is not just giving people jobs.
Applied is giving people the possibility of a life that wouldn't otherwise be possible. So for me, how can we scale that? The first way we can scale it is, of course, just getting more people and more employers to use the product and making this a scalable solution. If we do that, we can see Applied as an expression of a more inclusive society, a fairer society, where there's level access to jobs.
So we're talking 10 years here, 15 years here. I would like to scale the product, what I've called a decision intelligence mechanism. This decision intelligence mechanism can be scaled to not just jobs, but how people get allocated funds, how people get into schools, how people get into universities, how you even get your mortgage.
All of those people decisions, we know, are completely biased today, and I would like to see Applied becoming a beacon of an inclusive, fairer world by having a solution for all of those problems. But that's like 20 years out.
[00:32:30] Maiko: Okay, can't wait for that. We'll check in then, maybe in 10 or 20 years, or maybe even before that.
Thank you so much for coming on. It's amazing to see what you've done already, and I'm wishing you and the team all the best for the next few years. So thanks, Khyati, for joining me.
Thanks, Maiko. It was a wonderful conversation.