Why Learning Agility is a key capability for candidates in the AI era | With Fiadhna McEvoy

Thursday 13th November


Why learning agility is a key capability for candidates in the AI era

AI is forcing a rethink of what we hire for — and how we measure it. If many hard skills now have a short shelf life, the standout capability for candidates is learning agility: the ability to pick up cues, adapt quickly and stay effective as roles evolve. For TA teams, that raises a practical question: what’s the earliest, fairest signal of learning agility at the top of the funnel?

Meet Fiadhna McEvoy C.Psychol, MCIPD, Psychometric Innovation Director at Arctic Shores — a business psychologist with nearly 20 years’ experience across occupational assessment, organisational development and psychometric innovation. In this episode, she joins host Robert Newry to lift the lid on a brand-new, task-based way to measure learning agility — and why it matters for every level, not just leaders.

With AI reshaping work, Fiadhna shares a pragmatic route through the chaos: focus on primary behavioural evidence, keep humans in the high‑stakes decisions, and adopt new signals responsibly as the data set scales.

Join Fiadhna and host Robert Newry as they unpack…

💡 Learning agility, properly defined. Beyond buzzwords: how learning agility blends implicit learning from real‑world cues, social dynamics, and comfort with uncertainty — and why it’s relevant from entry‑level to exec.

⏱️ Why this matters now. In a world where technical skills can “expire” in around two and a half years, analytical thinking remains paramount — alongside resilience and influence — as roles, tools and team dynamics keep shifting.

🧪 Show, don’t tell: introducing a new task-based assessment for learning agility. Traditional, text‑based measures (self‑report, SJTs, even some ability tests) are increasingly vulnerable to AI assistance. A task‑based approach captures thousands of micro‑decisions to generate explainable, AI‑resilient evidence at the front of the funnel.

🔍 Validated, not a black box. From theory to prototype to iterative testing, Fiadhna’s team focused on construct validity and real‑world performance — culminating in a BPS Division of Occupational Psychology presentation and a forthcoming paper.

🚀 Early access in high‑stakes hiring. What TA teams are seeing so far: a trading firm combining learning agility with quantitative reasoning to shortlist at speed, and a global accounting firm applying it across Early Careers streams — with consistency from top‑of‑funnel through to assessment centre.

🧭 Rethinking proxies. Why educational attainment and CV heuristics are weak predictors of workplace performance — and how primary evidence offers a fairer read of potential.

This episode is packed with clear thinking, grounded science and practical takeaways for any TA leader navigating AI‑enabled hiring. If you’re exploring how to identify and select for learning agility, not just past experience, this one’s for you.

Listen now 👇


Transcript:

Robert: Welcome to this special edition of the TA Disruptors podcast. I'm Robert Newry, co-founder and chief explorer at Arctic Shores, the task-based psychometric assessment company that uncovers potential and sees more in people. In this episode, we will be taking a look at a major new development: the release of the first task-based measure of the much talked about workplace capability, learning agility.

And to tell us about this groundbreaking development we have a leading business psychologist, Fiadhna McEvoy, Director of Psychometric Innovation at Arctic Shores. Fiadhna is a psychology professional with nearly 20 years' experience spanning occupational assessment, organisational development and psychometric innovation. As Psychometric Innovation Director at Arctic Shores, Fiadhna leads the design and delivery of its cutting-edge task-based assessments. Prior to this, she headed up assessment design at PeopleScout and held senior roles at TMP Worldwide, Cardiff Council and Capita Resourcing. Welcome to the podcast, Fiadhna.

Fiadhna: Thank you very much for having me, Robert. It's really exciting to be here. 

Robert: Well, we've got something exciting to be talking about today too. Let's start with: what is learning agility? I think it was a Forbes article that said learning agility is working out what to do when you don't know what to do. So what's your definition of learning agility? And is this any different from some of the past definitions?

Fiadhna: So at Arctic Shores, our definition of learning agility is the ability to understand and learn from one's environment by understanding the relationships between events and outcomes, and using that information to generate and integrate new information. So it links to, and will give an indication of, someone's ability to learn from those explicit and implicit pieces of information in the environment that they're working within. It is also linked to someone's ability to understand social dynamics and understand patterns in how people in a social situation are actually operating, which obviously then links to people's ability to collaborate and work in teams.

And then there's also an element of being able to navigate uncertainty, and to anticipate and understand how situations are going to play out as they unfold, using the information in the environment. So when we think about the term learning agility — I'm sure you remember when it kind of hit the HR world.

Robert: It has, exactly. It's been something that's been talked about on and off over the years.

Fiadhna: It was back in 2000 that Lombardo and Eichinger actually coined the term, and it landed so well with the market. I think you couldn't really turn around without there being a high-potential leadership programme with learning agility at its core. So it really resonated with the market. And when you go back and look at the actual academic literature around it… it's very interesting, because there are slightly nuanced differences out there, all hinged around the concept of learning and adaptability.

But there are slight little nuanced differences. And another quite interesting point is that at the very start, when learning agility came into being, there was a real emphasis on it being for leaders, at that more senior level.

Robert: Right. Rather than something that people would have innately.

Fiadhna: Yeah, but more recent research has broadened out its applicability, really. And it's now seen as equally important for any role — any role — which kind of makes sense. You know, learning should not be reserved just for that more senior level.

Robert: Sure. Yeah. So if I understand it correctly, there are three elements in the definition that you alluded to there. One is your ability to acquire knowledge and use that to then adapt your behaviours and take advantage of what you've learned. Another is social interactions. That's interesting — I hadn't thought about it in that way, that clearly, as humans, our learning agility has a social aspect as much as a cognitive one.

And then the last is just comfort with uncertainty, which is also incredibly interesting because we live in uncertain times. And I know from my own discussions with many organisations that when they bring people into the organisation, especially early on in their careers, those people are going to have to deal with a great deal of uncertainty and will have to quickly adapt and learn from it.

So I think that's kind of interesting, having those three elements to it rather than just a single aspect. Just to help bring a little bit more to that definition: is learning agility more than just a cognitive ability, then? Because people may think about learning agility, from a lay person's point of view, as just your ability to learn — something that we do at school, a mechanical activity. But clearly it's more than that. So can you bring maybe a little bit of that to life?

Fiadhna: I suppose it's really reflected in the definition, in the sense that it is quite broad. We are constantly learning all the time. In a school environment, it's a very explicit type of learning, where you're sitting in a classroom, whereas every day we're in situations where we're learning, picking up signals, taking in new experiences all the time.

Robert: So more implicit then, right?

Fiadhna: Exactly — it's being drawn out, as well as the more explicit learning. So you're right, the common association with learning is those more explicit forms — you know, a learning and development programme or a training course would be boxed into that explicit category — but there's more implicit activity that's happening all the time, all around you, that you are unconscious of. Yeah. And that's what we're tapping into when we're thinking about that definition.

Robert: And I really like that, because many people will quite rightly think, in their own situation, that we learn in different ways in this explicit sense. Some people are more visual, some more verbal, some more written — we all have a way, on the explicit side, that we like to learn. And I suppose what you've alluded to there, what we're trying to tap into, is not the way that you learn — the more explicit way that we think about learning — it's much more your internal way that you absorb information.

However that has come about for you — and then use that in the three ways that we talked about. So, okay, that definition: there are some nuances to it from when it all became the big thing back in 2000, and subsequently since. What you've been talking about there is that you've been building on a definition that has been well understood and researched, and you've explained some of the nuances around it — your interpretation of learning agility, being able to explain: this is what it is, these are the elements to it.

And so why do you think now, as we come into the age of AI — generative AI particularly, because AI has been around for a while — that learning agility has come back to the top of organisations' interests again? Is it because there is just so much disruption caused by AI, and it's going to change the way that knowledge roles adapt and grow for individuals as they come into an organisation?

Fiadhna: I think with the backdrop of AI, there's a lot more of an emphasis on those softer skills coming to the fore as a priority for employers. And with the rapid technological change, it's said that skills now have a shelf life of two and a half years — those more technical skills — which means, you know, if employers want to recruit in a sustainable way, one that future-proofs their organisations…

I think we need to be thinking more broadly than those harder skills, which then puts into play, and increases the emphasis on, soft skills — one of which would encompass that learning agility piece. And again, thinking about the rapid technological change, the workplace is changing at pace, so much more than it was in the past. Just that core requirement — to be presented with a new technology, new tools, new ways of doing things, and be able to absorb it, integrate it and then use it within your day-to-day job — is becoming more and more important, and will end up acting as a bit of a differentiator between the organisations that are really able to embrace AI and use it to best effect.

And those that haven't been able to embrace it. If you're cultivating a pool of employees who are demonstrating that propensity for learning agility, then that, combined with the technology, is a really powerful combination to maximise the times that we're in.

Robert: I think that's really interesting. I read a piece the other day about how we used to celebrate and value what they called someone's codified knowledge — knowledge that we had learnt, whether through rote learning or just spending time studying something. And very much the way that we developed people, and what we expected from somebody in a role, was as much their codified knowledge as their tacit knowledge — which was how they had used their experience and their individual perspective of the world alongside, or on top of, the knowledge they'd acquired.

And so when you think about it that way, and the way that you alluded to it, with generative AI now — and you can see it in roles, most obviously in things like law — you no longer need to teach somebody a great deal of knowledge about every legal case that's out there, because generative AI will be able to research it and summarise it. It'll be very much about how you apply it. And that'll come from the tacit knowledge, which will be very much linked to somebody's learning agility: not just how do I acquire knowledge, but how do I use that knowledge to add value, from my experience and from the way that I absorb more than just the codified knowledge.

And so that all kind of makes sense, but… is that something that you and your team just observed, or is it something the leading consultancies out there have seen too? Has there been any other evidence, other than hearsay, on this?

Fiadhna: Based on what you said there, there's one point I'd like to draw out. Yes, AI can massively help in terms of that core knowledge base, but it also requires that critical thinking — is the information accurate? — because it cannot be 100% taken on board. So I think that's what's been really interesting, actually, because when you look at the research — and particularly what the World Economic Forum does; they regularly do fantastic reports — they published a Future of Jobs report in 2023 and one in 2025. And I was quite curious to understand…

Robert: What had changed.

Fiadhna: Exactly — had things gone up, had things gone down? Lo and behold, the top spot in both years was analytical thinking. So despite AI…

Robert: still remains.

Fiadhna: It still absolutely remains. In fact, it will probably become increasingly important, because there is so much more information and noise out there that people need to be able to effectively separate the noise from the actual core information. So that was, I think, a really interesting takeaway: analytical thinking retains its absolute importance. Then, some other interesting changes: leadership and social influence increased in importance this year. When you put that into the context of so much change happening in organisations, and AI not having a focus on interpersonal elements, you can see why that focus on influence — being able to bring people behind an idea, being able to create a level of followership and then influence people towards…

Robert: That will always be a fundamental human trait and value.

Fiadhna: But that has increased in importance, as have resilience, flexibility and agility. Which again, if you think about the context: on an individual basis, there's so much more change happening. Individuals in the workplace need to be able to absorb this change and maintain a level of work output whilst all this change is happening around them, and also integrate new information and newness into their roles. And again, there might be job design changes as a result of AI. So there are so many different components of an individual's experience at work now that have become a lot more complex. That resilience piece really comes into play, as well as the learning agility piece.

Robert: Right. It's a good point, actually. Learning agility is clearly going to be important as we move into a world of greater change and uncertainty. But it's not the only thing that's going to be important — it's one of a number of things that has to be put into that context. But it is something that we've talked about, and then it kind of disappeared a bit over the years. Now it has come back to the forefront again as change has accelerated and kicked in because of AI.

So, okay, we understand now what learning agility is and why it's come back to the forefront again. And you've, like me, been around the block for a bit and seen that coming and going, and how we've assessed it in the past. So why did we need something new to measure learning agility? Because if it's something that we've known about for a while and people have been assessing it, you would just assume that people could dust off what they had before and bring it out again. But clearly the world has changed a bit, and there are some limitations and challenges to the old ways of measuring learning agility. So can you share, you know, some of the challenges and risks around that, and why a new method was therefore needed?

Fiadhna: Yes, yeah. It's a great question. So if we kind of think back to some of the more original measures of learning agility, they tend to be more traditional in nature, so more based around self-report. So an individual giving a read of how they see themselves in response to questions linked to learning agility.

But now we are in a situation where we have the likes of ChatGPT, which can be used very easily to help shape how a candidate responds to those particular questionnaire-based items. And the academic research in the occupational psychology world is out there now, demonstrating how the likes of cognitive ability tests — which have also been used as a proxy for assessing and measuring learning — are vulnerable to candidates using AI to help complete them.

Similarly with situational judgement tests, which can also tap into that construct of learning agility — candidates can essentially be helped through use of ChatGPT. So we're in a position where the more traditional approaches to assessing learning agility are compromised, and we need to be thinking about things in a slightly different way.

Robert: Yes. So you've got an issue around self-report, and then you've got the vulnerability of the traditional ways of asking questions. Let's just go to self-report. I always find this fascinating: how would you know, whatever questions you might be asked, how good you are at learning agility? It feels like something to me, like resilience, where you almost have to put somebody through their paces to understand how they really compare. Because I'm sure, like many people, if you asked me how quickly I learn — or whatever the questions were around that, and you can put the nuances and phrases in there —

I'm going to compare it to my own carefully selected group of people that I know. And depending on how well you've selected that group, you will just have an impression that, oh yeah, I'm great at learning things. But actually, when you compare it to the broader group — which most of us have never done — I could be nowhere near as good as I thought I was.

Fiadhna: Yeah, absolutely. And the interview-based approach is, and has been, very much used as well to explore and get a read on somebody's learning agility. But as you say, it's a secondary data point that you're collecting in that type of situation. I'm just thinking back to my earlier days, when I was very much involved in designing assessment centre exercises. I remember one particular situation where we were designing a role-play exercise for a call-centre role, and the team approached it in a really clever way. There was one scenario where the candidate interacted with the role player to deal with a problem issue for the call handler. Then the assessor in the room gave feedback to the candidate, and then they ran the role play again with a slightly different scenario, but tapping into those same constructs. So in that situation, you are getting a primary source of data — you're literally observing it in the moment. But obviously that's at the assessment centre stage.

Robert: Which is not scalable.

Fiadhna: And not scalable — you're investing a lot of time and resource at that stage. So that's an ideal way of assessing, but if you want to be in a position where you're assessing it at the front end, we do need to be thinking about things in a different way. So, the way that we've tackled it — I'm sure many of your listeners know that we operate using a task-based approach.

What that means is we use tasks that a candidate interacts with, so we are understanding and getting a sense of how they're responding to the tasks. As they move through a task, they're constantly making tiny little decisions as they go. Now, if you take a decision in isolation, it's not really telling us all that much. But when you look at all those small decisions compiled over the whole exercise, you start to see patterns in the way they are interacting with it. And from that information we're able to gain really valuable insights into how a candidate might learn in the workplace, how they might demonstrate their determination, how they might focus their attention. So through that primary source of evidence — actually getting a candidate to interact with the task — we're able to get a real measurement of how that candidate is likely to operate in the workplace.
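To make that idea of patterns across micro-decisions concrete, here is a minimal illustrative sketch of how a stream of small in-task choices might be aggregated into candidate-level signals. Every name, field and feature here is invented for illustration — this is not Arctic Shores' actual task or scoring pipeline.

```python
# Illustrative sketch only — invented event structure, not a real product's pipeline.
from dataclasses import dataclass

@dataclass
class MicroDecision:
    trial: int            # position of the decision within the task
    followed_cue: bool    # did the choice track the implicit cue pattern?
    correct: bool         # did the choice lead to the rewarded outcome?

def pattern_features(events: list[MicroDecision]) -> dict[str, float]:
    """Aggregate many small decisions into candidate-level signals.

    No single decision is informative on its own; the signal is in trends
    across the whole task, e.g. whether cue-following improves over time.
    """
    n = len(events)
    half = n // 2
    early, late = events[:half], events[half:]

    def cue_rate(chunk: list[MicroDecision]) -> float:
        return sum(e.followed_cue for e in chunk) / max(len(chunk), 1)

    return {
        "accuracy": sum(e.correct for e in events) / max(n, 1),
        # A hypothetical implicit-learning signal: did cue-following rise
        # from the first half of the task to the second half?
        "learning_gain": cue_rate(late) - cue_rate(early),
    }
```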

Robert: Brilliant — yeah. And I get that the ideal way to really understand somebody's capabilities in some of these traits is just to go for the primary evidence, as being…

Fiadhna: the ideal to have somebody in the role for a few months.

Robert: Yes, that's right. And so, you know, we have to have something that's scalable and repeatable and consistent.

We understand that there are challenges around self-report, and the vulnerability now that tools can take language-based assessments on behalf of candidates and ace them. And then in the past, we also had proxies such as educational attainment — whether that be GCSEs, A levels (probably more A levels) and then degree results.

Why can't we rely on educational attainment as a proxy for learning agility, either in the way that we used to, or perhaps it was never a very good proxy in the first place?

Fiadhna: Well, you mentioned CVs and attainment. I think, you know, Schmidt and Hunter's classic meta-analysis shows that actually they're quite weak predictors of performance in the workplace.

But when you dig into educational attainment, and think about the context in which that education happens, there are so many extraneous factors that can play into somebody's ability to achieve in that very particular context. So in terms of the biases that can creep in as a result of the tool that you're using, you need to be really, really careful. Having a more objective read of learning — of an individual's ability to learn — really is a fairer way of approaching it.

Robert: Yes, you have to take it out of that context to some extent. Absolutely. What school you went to, what teacher you had, what was happening at home, what was happening at the time — so many things. And I think that's probably one of the most important aspects around this: if we don't have a good proxy, and we can't rely on the way that we used to measure it, we needed something different.

So we've established the importance of learning agility, and that the traditional ways and proxies we used to rely on — to give us an indicator of somebody's propensity for learning agility — are no longer as good as perhaps they were. And so we had to design something entirely new, going for a primary-evidence approach. So how did you go about that? Because this is breaking new ground, it seems to me — trying to come up with a task that taps into learning agility — and you've got to come up with something that is better than what we had before. So I imagine that must have taken quite a lot of work and research. Can you share some of that?

Fiadhna: Well, I think the first thing to say is that we are incredibly lucky at Arctic Shores to have such a fantastic group of psychometric experts within the team — the people who have been involved in actually bringing the task to this stage have been really exceptional. But it has been in the pipeline for a while. And for our psychometrician, Dr. Luke Montiori, the concept of learning — and implicit learning in particular — has been very much part of his professional journey over a number of years; he has a particular interest in that area.

To the point where he's contributed to the research literature out there and published papers on it. Essentially, as we do with all of our task development, we start from the theory. We start from the body of literature that supports, and acts as a foundation for, the psychological construct that we want to measure — which in this case was learning. And from that initial discovery stage, we identified what types of tasks already exist out there in the scientific literature and in the community that we might be able to adapt for a recruitment context.

Robert: Because we're moving a bit more into the world of cognitive neuroscience here, rather than just the sort of psychometric or psychology construct where there's been the familiar, traditional approach.

Fiadhna: Yeah, so we were tapping into the cognitive psychology literature and the individual differences literature, pulling that all together, exploring it, and identifying key tasks that would be relevant. And we were particularly curious about the concept we spoke about earlier — implicit learning — and how that could be integrated in a clever way into tasks. So that provided a bit of a guiding light as to the approach that we wanted to take.

So when we were in a position where we had a couple of tasks, we engaged with UX and our developers to build prototypes of those tasks, which then allowed us to go into the really key part of the process, which is testing. That testing cycle was a really, really interesting one. Basically, we were using the prototypes, getting participants to complete the task, understanding how they were approaching it, and questioning whether the task was measuring what we wanted it to measure.

Robert: That's right — that must be one of the hardest bits, going from theory to practice around all of this, because there's a big bridge to cross between the theory, what you do in a lab, and suddenly pushing it out there, with people on different devices. You can't even control their understanding of it, and you're probably still learning how they interpret the way that you're setting up the task.

Fiadhna: Absolutely. So it is that iterative data collection process: feeding the insights that we've gleaned from the data back into the task prototype, iterating again, testing again, getting new data through. But that's really important, because we're not going to be confident bringing a task to market if we don't know it's doing what it's meant to do, and doing it in a reliable way. That's absolutely critical for us.

Robert: And just on that first bit — because, as you alluded to, there are a number of different things to get something like this ready before you take it out to market. But the first piece is: how do you know it's measuring what you wanted and expected? Because the theory is controlled; when you're putting it out there, you have got less control. You're getting different data signals in, and so you're having to work through the noise of those signals to make sure that you're capturing the data that relates to what the theory said you should be capturing.

So could you share with us: how do you do that? How do you know, at the end of the day — what was it that enabled the team to confidently say, right, we know from this data that this is capturing what the theory said we should be capturing?

Fiadhna: It was a combination of work between our applied psychologists, cognitive scientists and psychometricians, really digging into that data and understanding what the construct validity looked like for the information that…

Robert: So construct validity is the term where this — learning agility — is a construct, a term that is well understood. And so we had to be able to show that what was being captured by this new task did actually measure that construct.
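As a rough illustration of what one piece of a construct validity check can involve, here is a minimal sketch of a convergent-validity calculation: correlating scores from a new task with an established measure of the same construct for the same participants. The data and the 0.6 relationship below are simulated purely for illustration; Arctic Shores' actual validation methodology is the subject of the forthcoming paper.

```python
# Minimal convergent-validity sketch with simulated data — purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
new_task = rng.normal(size=200)                      # stand-in task scores
established = 0.6 * new_task + rng.normal(size=200)  # stand-in established measure

r = np.corrcoef(new_task, established)[0, 1]
print(f"convergent validity r = {r:.2f}")
# A meaningfully positive r here — alongside weaker correlations with
# unrelated constructs (discriminant validity) — is part of the evidence
# that a task measures what the theory says it should.
```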

Fiadhna: Exactly. And we've actually got a paper coming out, hopefully in the new year, which digs into this — the way that we approached learning agility and the methodology we took to building this new task — because the approach we took was slightly different and really did push boundaries from a psychometric perspective, in a very innovative way. So we presented at the BPS DOP conference on this, because it is a bit novel and different.

Robert: So the BPS being the British Psychological Society.

Fiadhna: Yes. Yeah. So, check that out in the new year when we'll make it available.

Robert: Well, that's exciting — really exciting. And so how have the team recommended that customers use it? Because the temptation could be: wow, this is fantastic, this is brand new, this is what I want — I want to put this right up there as the number one thing to measure. But it's a time when we're still gathering data on this, so there has to be a degree of caution, hasn't there?

Fiadhna: That's right, yeah. So that's where the limited use comes in. At the moment, we allow users to apply a certain weighting to the various skill enablers. But while learning agility is in this early access phase, we have limited the contribution that learning agility can make to the overall score that will be fed back to the employer on the candidate's performance. So that provides a level of confidence and certainty there.

Robert: So there's a data point there, but it can't over-influence the overall fit profile score — and therefore the benchmark.

Fiadhna: Because, as I said, building up that really powerful high-stakes comparison group is really key to the final piece of the puzzle.
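As a loose illustration of that "limited contribution" idea, here is a sketch of how a capped weighting might work in an overall score while a new measure is in early access. The skill names, weights and cap value are all invented — this is not the product's actual scoring logic.

```python
# Hedged sketch of a capped early-access weighting — names and numbers invented.
def overall_fit(scores: dict[str, float],
                weights: dict[str, float],
                capped: str = "learning_agility",
                cap: float = 0.10) -> float:
    """Weighted average where one measure's weight is capped."""
    w = dict(weights)
    # Limit how much the early-access measure can contribute...
    if w.get(capped, 0.0) > cap:
        excess = w[capped] - cap
        w[capped] = cap
        # ...and redistribute the excess across the established measures.
        others = [k for k in w if k != capped]
        if others:
            for k in others:
                w[k] += excess / len(others)
    total = sum(w.values())
    return sum(scores[k] * w[k] for k in w) / total

# Example: learning agility scores well, but its 0.2 weight is capped at 0.1.
print(overall_fit(
    {"quant_reasoning": 0.8, "resilience": 0.7, "learning_agility": 0.9},
    {"quant_reasoning": 0.5, "resilience": 0.3, "learning_agility": 0.2},
))  # -> 0.775
```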

Robert: Brilliant — very sensible, and a lot of careful work, it seems, over many years to get to this point. And then I suppose the question is: now that it's out there, and you've said it's being used by some select customers in a high-stakes environment, are you able to share some of the feedback that has come back from one or two of those customers?

Fiadhna: Yeah, so we've been working with a trading firm to redesign their trading profile. They conveyed to us that quantitative reasoning was absolutely key for the role, as was being able to think through a problem critically, and attention to detail — but the ability to learn at extreme pace on the role was really, really key.

So when they heard about learning agility, they were delighted. Then they saw the task, saw how it operated, and they really loved the experience that the candidate would go through. They could see how it was really tapping into the things they would want in the role. So they have included it in their profile, and could see how valuable it would be to use that insight to decide who comes through to the next stage — the interview stage.

And then we're working with a global accounting and consulting firm, helping them with their Early Careers streams — tax, audit and consulting. Learning was absolutely critical for them: the ability to navigate uncertainty, manage change. So they have included it in their profile, but they are also going to be looking at learning agility in their assessment centre. What they're likely to see, then, is that those candidates who've performed well at the task will also be demonstrating learning agility in the actual assessment centre.

So it'll give them, hopefully, a really strong candidate pool to be selecting from at the latter stages of the process. The early indications have been really positive — there's a real initial attraction, acceptance and excitement about the task itself. It feels like we're bringing something that's going to be very valuable and powerful to employers, at a time when it is very difficult to identify those candidates who are really going to make a difference in your organisation.

Robert: Very, very exciting. And I always like scenarios where something like this is going into a recruitment process at somewhere like a trading organisation, where it's the data that you pick up through the recruitment process, and then you get very quick and very clear performance data back again in a trading environment. There's a training element to it, too. Well, that's been a fascinating conversation, Fiadhna.

Thank you for sharing your thoughts and insights and all the background behind this. And for those people who are interested in learning more, this will be coming out very soon — October is going to be an exciting month for Arctic Shores. Look out for blogs on our website, look out for updates through the TA Disruptors newsletter, and I'll also be sharing things through my LinkedIn profile.

And if this is something that people are excited about, do connect into the TA Disruptors newsletter, follow Arctic Shores on LinkedIn or my own profile, and look out for some very exciting launch stories and messages about something we believe is going to be transformative in the coming months and years — for how organisations are able to tap into something essential for performance in the workplace in the era of AI.

Robert: Thank you, Fiadhna.

Fiadhna: Thank you, Robert.

 
