Over the next several years, the data landscape will undergo significant changes. As AI becomes more prevalent, businesses will become more reliant on applied analytics and machine learning. These advancements raise questions about strategies for maintaining ethical, trusted data that is free from bias, and about how that data can be applied to further the goals of purpose-driven organizations. Listen as CDOs explore the critical role of data strategies in today’s rapidly evolving business landscape, discussing topics from fueling innovation and transforming the business with data insights to preparing for and utilizing generative AI. The conversation also touches on the importance of diversity, especially when it comes to data analysis and better decision-making.
Sarah B. Nelson 00:01
Welcome, everyone, to season three of The Progress Report. I'm Sarah B. Nelson, Chief Design Officer for Kyndryl Vital, and I'm excited to have you all here today because we are going to talk about data: data strategy, data and AI, diversity, trusted data. And we're going to dig into the minds of two fantastic chief data officers. First, we have Claire Thompson, who is the Group Chief Data and Analytics Officer for Legal & General. Claire was recognized by DataIQ as one of the top 100 influencers in the UK in data and analytics. She has a vision to use data to create a better future for customers and the society we live in, and she's particularly interested in how we build the next generation of women in data science. We also have Gary Burnette, who's the Chief Data Officer for Kyndryl. He's been an industry leader in groundbreaking data and technology solutions that drive transformation and enable businesses to make smart, trustworthy decisions. He also leads Kyndryl's Native American Interest Group and is very actively involved in the hiring, development, and advancement of Native American professionals in technology. So welcome, Claire and Gary. I'm going to start with something super obvious, and I'm going to throw it to Claire first. Why is data important?
Claire Thompson 01:25
So that's a big question. Data touches every single part of any organization that you work in, and it has done ever since I started my career, and even before that. Where things get really interesting is how important data is now becoming to organizations: how they're realizing the strategic importance it has, and how it can enable them to do so much more. It really does enable you to make faster decisions and to deliver better customer outcomes and better customer experiences. It really does fuel the organization. It's everywhere.
Sarah B. Nelson 01:57
So Gary, what does a great data strategy look like?
Gary Burnette 02:01
Quite simply, data is what tells us how our company is performing. Simple as that. A lot of people used to talk, and you still do hear some talk, around the subject of data literacy, and I guess there are lots of definitions and facets to that definition. But for me, it's quite simply, "Are we using data? Have we designed the system to use data to drive the decisions that we make around our business?" Here at Kyndryl, our CEO has been very vocal, as well as our CFO, about this notion of "The Three A's", right? How we're driving ourselves towards profitability using KPIs around accounts, alliances, and automation. All of that is fueled by data. It tells us where we are. Equally importantly, it tells us where we aren't, and it tells us where we need to focus. So data has always been the core element of driving any sort of operational strategy towards operational excellence or growth.
Sarah B. Nelson 02:56
You know, it's interesting what you're saying, Gary, about the use of data for business decisions and being able to trust that data to make good, solid decisions. That would be one use of data. And then Claire, what I hear you also talking about is other uses of data: how you enable customers to have an experience and trust either that their data is safe, that the data is correct for them, or that data enables them to do something. All of our experiences sit on top of that. I guess, is it fair to say that we have different big buckets of use cases for data?
Claire Thompson 03:30
Every single part of the organization has multiple different use cases that all incorporate data in some way, shape, or form: making risk decisions, pricing choices, customer experience, digital experience, all of the financial performance and how you predict it, fraud, financial crime. And the introduction of new models in the Gen AI space is opening up even more capability, because it's another model to add into the toolkit that enables you to go after use cases we haven't been able to tackle before. So perhaps you will have been able to automate a process up to a certain point, and Gen AI is now enabling you to go that last little bit, or another bit further along that journey. I think that's the bit that makes it so interesting, and the landscape has evolved quite dramatically. You know, if I go back to where I started my career, the sort of tools and landscape that were available at that point in time compared to now, and the pace at which it's changing now, is quite remarkable. It's really shifted, and it's very exciting.
Sarah B. Nelson 04:28
It's interesting because it's exciting and obviously daunting - I imagine there are a lot of folks out there who find it daunting. I'll share something with you all: I decided I was going to sit with ChatGPT and ask it about your bios, and it was interesting to see what it would pull. Of course, what it pulled was based on what you all wrote, but also on what it assumes someone like you is like, so it pulled up stuff that basically stripped out all of your character and what makes you unique. So I guess the question then is, when you're thinking about organizations, what kinds of data governance models and platforms do you need to have in place to navigate all of these changes?
Gary Burnette 05:11
As we launched our data strategy here at Kyndryl, we knew that our success was going to be relative to being a data-driven enterprise, and that a lot of that success was going to lie in governance-related topics. So making sure we understand what the sources of data are. Making sure we have approaches for measuring and tracking data quality. And, probably most insightful, the lineage of how the data is being used. When we think about where we're going, we're going to build on that automation to bring in other aspects, intermingling with topics like bias: understanding bias in that data and rectifying or remediating that bias. So again, a lot of what we've been talking about as CDOs and in the data world for some time, I think all that's still important. But it also becomes foundational for where we need to take it, and I think where we need to take it in this world is governance. In many ways, I think it's going to surpass engineering in its importance as we drive more towards AI and Gen AI, because it's actually the management of that data, and the assurance that the data is trusted, that are going to drive the models we need to drive our business. You mentioned our bios. It's a little scary to think what the thing may have said about me, but all of that is just part of getting used to the fact that we're going to learn things that we don't know, and we're going to learn it because of the data work that we do behind the scenes.
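The quality measurement Gary mentions can be made concrete with a small sketch. This is not Kyndryl's actual tooling; the fields, validators, and records below are invented for illustration. The idea is to track simple quality dimensions, here completeness (is the field populated?) and validity (does the value pass a business rule?), per field across a dataset:

```python
# Illustrative data-quality tracking sketch (hypothetical fields and rules,
# not Kyndryl's tooling): score completeness and validity per field.

def quality_report(records, required_fields, validators):
    """Return per-field completeness and validity rates in [0.0, 1.0]."""
    total = len(records)
    report = {}
    for field in required_fields:
        present = [r[field] for r in records if r.get(field) is not None]
        # Fields without an explicit rule are considered valid when present.
        check = validators.get(field, lambda _: True)
        valid = [v for v in present if check(v)]
        report[field] = {
            "completeness": len(present) / total if total else 0.0,
            "validity": len(valid) / total if total else 0.0,
        }
    return report

records = [
    {"account": "A-1", "revenue": 1200.0},
    {"account": "A-2", "revenue": -50.0},   # invalid: negative revenue
    {"account": None,  "revenue": 300.0},   # incomplete: missing account
]
report = quality_report(
    records,
    required_fields=["account", "revenue"],
    validators={"revenue": lambda v: v >= 0},
)
```

A governance process would run checks like this continuously and trend the rates over time, alongside lineage metadata about where each field originates and where it is consumed.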
Claire Thompson 06:36
Data governance is obviously critical. I think there would be very few CDOs out there who would turn around and say that it's not an important part of anything you do. Having those strong foundations in place is what enables you to move more quickly, especially on the AI journey. Those that have really strong data foundations and good data management principles will be able to move very quickly. Also, with the introduction of Gen AI, there's more of a view about how you bring risk in a lot earlier in the process from a governance perspective: to review the use cases and make sure that you're happy that, ethically, they fit with your brand, your organization, and the customers that you're serving. So how do we make sure they're really in the process right up front?
Sarah B. Nelson 07:18
Who are you engaging in those conversations? And has that group of people changed, or is it changing, from what was typically in a governance group?
Claire Thompson 07:30
I think everyone's evolving and adapting. This is a brand new capability that's coming in, so we are all learning. And as I mentioned, the pace of change, the number of tools, and the new capability coming are getting quicker and quicker. Certainly for me, what's really key is how you come together as an organization to work through those quickly, at pace, and collaborate to get to best practice. How do you start to build an AI product that can be reused multiple times for a variety of different use cases, but built in such a way that it's got the right guardrails around it to keep you safe going forward?
Sarah B. Nelson 08:06
Here's what I'm thinking about with organizations. It strikes me that you have to have a really clear idea of what data is important, how you're collecting it, how you know it's trusted, and how it's going to get used, but now we also have to stay open to the things we don't know.
Gary Burnette 08:21
I mean, as you look at data collected, and then eventually used, even in traditional BI, right? You learn about bits of the process that aren't executing just the way you kind of thought they were. You learn about sometimes bad execution or incomplete execution, because the data doesn't tell the story that was expected. The data is going to tell us what the data needs to tell us. It's going to become our job to make sure that in telling us that insight, or in telling us those results, we've done the right work to ensure there's not bias in how that insight is derived. But if we can guarantee that or drive towards guaranteeing that, then we can get a lot more comfortable with listening to what we're being told or listening to the advice we're getting from our own datasets.
Claire Thompson 09:05
Actually, I think there is a real opportunity to use some of the unsupervised machine learning to identify the unknowns that you have in there. You're not seeking a specific answer; you're using the data to identify the unknowns. That, for me, is the power of some of what we're now able to do in that space. You can flip it around and think of it in a more positive way: you can use it to identify things you've not considered before.
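Claire's point about unsupervised methods surfacing "unknowns" can be illustrated with a minimal sketch. The data and threshold here are invented, and real systems would use richer techniques (clustering, isolation forests, autoencoders); this just shows the shape of the idea, flagging records that deviate from the rest without being told what to look for:

```python
# Minimal unsupervised "unknown" detection sketch (illustrative data):
# flag values that sit far from the bulk of the distribution.
import statistics

def find_unknowns(values, threshold=3.0):
    """Return indices of values more than `threshold` std devs from the mean."""
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > threshold]

# Hypothetical claim amounts: eight routine values and one anomaly.
claim_amounts = [102, 98, 105, 99, 101, 97, 103, 100, 9_500]
outliers = find_unknowns(claim_amounts, threshold=2.0)
```

No label says the last record is suspicious; it surfaces purely because it doesn't fit the pattern of the rest, which is exactly the "identify what you weren't looking for" use Claire describes.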
Sarah B. Nelson 09:32
Because it does a lot of the more rote things. One thing we're seeing in design research is that we used to spend weeks doing interviews, gathering the qualitative data, organizing that data, getting it into forms we could work with, then summarizing and validating that summary. And that only gets you to your top-line observations. With a lot of this stuff, I was able to get a whole bunch of data from a collaborative workshop into top-line thinking in about an hour and a half, just by the way the tools work. But what it does is free us up to focus on, to your point, the things we're not seeing. Because now we have the time and the brain energy left to dive into it. Okay, so all of this is now also pointing me to skills. What kinds of skills do we need in order to be able to find the things we don't already know?
Claire Thompson 10:22
So for me, it's really important that everyone in the organization understands the role they have to play in data. Not just the data teams or the technology teams that are perhaps used to dealing with information in various different forms. It's really important for individuals in a contact center to understand how important and how valuable the notes they're typing into the system are. That information could be reused to help customers further down the line. Or, for example, when you key a piece of information in incorrectly, there's a ripple effect further down the line, and somebody then has to correct it. So the more you can do to incorporate that into the design and make sure the quality is there upfront, or to educate people on the important part they have to play, the better. That's really key for everyone in the organization. But with a lot of these things, it's also about the fact that we're augmenting people and helping them. It's that augmented human intelligence. How do we make them more efficient in a process, or help them find a piece of information really quickly? Hopefully we free up some of the more mundane things they don't like to do, so they can focus more on the stuff they do enjoy doing. But yeah, it's really important to think about how you're training everyone in the organization: from the senior exec level, on how to ask the questions, what the opportunities are, how you might examine data, think about the insights, and what actions you're going to take off the back of it, right down to the really detailed skills that we have.
Sarah B. Nelson 11:49
So for a company that's purpose-driven, how might data help it achieve its goals?
Gary Burnette 11:55
So I think it's got a lot to do with using the data to help us understand where we need to be in a particular business discussion. It helps us identify trends and outliers across various cohorts who are a part of the overall model or a part of the overall business discussion. It helps us prioritize for efficiency and helps us decide where to assign resources: people, software, time, etc. But the other thing data will allow us to do is get a little bit beyond the classic A/B testing, right? Do this and see the results. Do that and see the results. And go straight on into "what-ifs". What if I do this? What if I do that? How does the model respond, and how does it tell us what the impact of some of those actions might be? And do we exploit, keep doing the same thing, or do we explore, taking risks and doing something new as part of what we learn from that "what-if"? So I think in purpose-driven organizations, we know where we're going, we know what we need to do, and we know what we need to measure. But I think there's also some "what-if" associated with the analysis behind that: looking at data as a way to help us understand the impacts of taking one decision versus a different decision, and what that says for us relative to whatever purpose we're driving, whether it's efficiency or growth.
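The "what-if" analysis Gary contrasts with A/B testing can be sketched as a toy scenario model. Everything here is invented for illustration (the profit model, the numbers, and the scenario names, which loosely echo the "Three A's" he mentioned earlier); the point is simply that several candidate actions are evaluated against one model of the business and compared, rather than trialed one at a time:

```python
# Toy "what-if" scenario comparison (all figures hypothetical): project an
# outcome under each candidate action and pick the best-performing one.

def project_profit(accounts, win_rate, cost_per_account):
    """Toy model: revenue from won accounts minus cost of pursuing all of them."""
    revenue_per_win = 10_000
    return accounts * win_rate * revenue_per_win - accounts * cost_per_account

scenarios = {
    "baseline":        project_profit(accounts=100, win_rate=0.20, cost_per_account=1_000),
    "more_automation": project_profit(accounts=100, win_rate=0.20, cost_per_account=700),
    "more_alliances":  project_profit(accounts=140, win_rate=0.18, cost_per_account=1_100),
}
best = max(scenarios, key=scenarios.get)
```

A real model would be fitted from historical data rather than hand-written, but the exploit-versus-explore question stays the same: stick with the baseline, or act on what the projections suggest.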
Claire Thompson 13:13
So Legal & General who I work for is very much a purpose driven organization in terms of, "How do we use the investments that we have to really build back society better and to support our customers with better outcomes? And how are we using data to decide which investments to make?" You know, we do a huge amount of work in energy solutions, regenerating city centers and investments in those areas. And data's involved in all of that in terms of how to start making some of the decisions. Anything we do in data, I would always turn around and say, "It has to be pinned back to what's the business strategy? What's the purpose? How is data enabling the business to actually deliver against its strategy?" That's really important. And being able to focus on that is probably the best way. If you've got an organization that at its heart has that purpose, you know that you're going to be using data to drive that really good purpose forward.
Sarah B. Nelson 14:04
You've said a couple of things that bring to mind that a lot of this is the interpretation of the data and turning that into action. And just from my experience, everybody sees the same data in different ways. So this is where the bias is. You've got potentially the bias of the data coming in, you've got potentially the bias in the way that data is processed, and then you have the bias in the decision-making. How are you bringing in different points of view to help you get towards what might be the best decision, and to have some confidence in that decision?
Claire Thompson 14:34
I suppose that comes down to diversity and inclusion more broadly. Every organization I've worked for is really passionate about driving more diversity and inclusion. Something I'm personally really passionate about is women in data in particular. How do we get more women into this field? Certainly in the technology space, there are not enough of us to add that diversity. But by having that diversity and inclusion piece at every stage, you've got a range of different people scrutinizing, reviewing, and thinking about what you're actually trying to do. And that's how you slowly start to get to that point. And actually, I think things are slowly starting to change. It has changed quite a lot from when I started my career to where we are now.
Gary Burnette 15:16
I think also, you think about what happens when you're constructing the models, or when you're constructing the data science approach to either answering questions or creating a place for questions to be answered. Whether it's that, or whether it's the inference of what you're learning as the model tells you, you can't presume what combinations of data, or even data elements within that data, are going to actually produce the result you want. And so bringing in a diverse set of people, whether that diversity is age, geography, culture, gender, ethnicity, or whatever it may be, right? Bringing those different points of view often helps clarify what the model needs to look like, how it needs to be constructed, and, equally important or perhaps even more important, what the inference feels like when you're looking at the results on the other end. Just an example from the indigenous world that I've been a part of, adjacent to my job: I read something a couple of weeks ago about the simple term "plant". In many indigenous languages, the term "plant" loosely translates into "beings that help us". Thinking about the medicinal aspects of a plant, or even the spiritual aspects of a plant, above and beyond consuming it for nutrition, of course. Someone who thinks that way is going to approach the concept of that word totally differently than someone who just thinks about the thing growing outside my window, or the thing in the veggie drawer in my fridge. They're totally different concepts. I'm not saying one's right or one's wrong, but they bring different points of view to a question.
Sarah B. Nelson 16:51
There's a "one plus one equals three" here, too. What's interesting about what you're suggesting is that, and I'll just speak for myself, perhaps coming from a North American point of view in a consumer society, I think about plants as something that I consume and interact with. It could be that I buy one from a garden center, plant it in the ground, and it looks pretty. Or I eat it and do something with it in this utilitarian way. But what you're also adding is another dimension, from another point of view, that says, "Yes, all that's true. We do eat it, but it's giving. We're in relationship with it." I think in innovation, we're always looking for something where you take things that are seemingly unrelated, put them together, and it makes something that wasn't possible before. So that's another advantage of having these different voices in there: it can drive innovation.
Claire Thompson 17:38
I agree with you. It's that different perspective. Bringing people together to look at a problem or a challenge that exists within an organization and rethinking, completely outside of the box, how you might approach it. And if you were starting from scratch, how might you reimagine a process? I've been very fortunate in my career to be around some of those moments where there have been new ways of thinking, where new things have come along for how to automate a process even further. And some of those will have come from people being in the room and challenging each other in an appropriate way: "Well, what have you thought about? What if you did this?" Or challenging assumptions as well. So I'm a really passionate believer in diverse teams. They can be challenging to manage, because you've got lots of people coming from different viewpoints, but hugely rewarding as well. You get to learn a huge amount and see how someone else views the world completely differently from you.
Gary Burnette 18:28
It's really easy to say we should have diversity. It's a harder question to say, "How do you get it?" Regardless of what you're thinking of when you think about the diversity. I would put this out there for data professionals, those of us in our companies as well as those that we interact with. I think it's incumbent on us to kind of help make sure there's actually a pipeline of talent out there for us to even draw from. You can't choose diverse candidates if there's no pipeline for it. So that might mean we need to get involved in things kind of outside of work or tangential to our jobs that say, "Okay, how do I inspire women in code? How do I inspire indigenous folks to get engaged in STEM related activities?"
Claire Thompson 18:39
I would agree with that, Gary. I think even going back into schools and our education system. How do we help younger generations understand the potential of the career opportunities that actually exist in this space?
Sarah B. Nelson 19:26
I'm curious also about how you as leaders are empowering and inspiring your teams to leverage the data in new and innovative ways? Or where do you see the opportunities for that?
Gary Burnette 19:36
People should be encouraged. People should be free to think, "Hey, it's alright if I bring up a question. It's alright if I challenge a particular point of view." Because in doing that, we all arrive at an answer that's better than had we not done that in the first place. So I think as leaders, a big part of this conversation especially as we drive off into discussions around bias and all that other sort of stuff is just to make sure people feel empowered to actually speak up and make a point.
Claire Thompson 20:02
As a leader, creating that safe environment for teams to feel they can voice their view or opinion, even if it may be the one that's a little more divisive or a little more out there, is really, really key. So is bringing those teams together. With a lot of solutions, certainly in the data space, it's not just about the data or building the best predictive model in the world. Unless I'm working with others to implement it into a system, or somebody else is changing the operating model or the customer experience at the other end, it doesn't go anywhere. Data is pretty much always an enabler. And therefore it's the group of individuals you're bringing together, with all the variety of different skill sets, the SMEs, the data engineers, the front-end UI developers, the data scientists. It's bringing all of those together that makes the solution actually happen. Bringing those people together, having that diversity, and giving them the space to try, experiment, and learn. I think data science use cases lend themselves really well to that. You create an environment where it's like, "We're going to run really quickly. We're going to test this. We think we've got a hypothesis as to what the outcome is going to be." Then you actually test that and take the learning. When it doesn't work, it's just a learning: we stop and move on to the next one. Or, actually, no, that really does work, and it's amazing when it's like, "Let's keep going to the next phase and get it into production."
Sarah B. Nelson 21:25
If you could give a person one piece of advice, someone who's building and who is responsible for data in their organization, what would it be?
Claire Thompson 21:33
I think it's really hard to choose just one piece of advice. And the reason I say that is, having worked in lots of different organizations, culturally and technologically, individuals in those companies are in completely different places. And what they need to help them succeed in AI may well be quite different. That's why I think it's quite difficult to turn around and give one piece of advice, because there are actually lots of components needed to make you successful: the data foundations, the skill sets that you need, the processes, the frameworks, and the governance. There's lots of it. And it depends on where they are on their data journey as to what's actually going to help them. So I suppose, if they don't know, the piece of advice would be to seek out a data professional, somebody who really does know. Then go and sit down and listen to them, and ask them how best to get started on that journey and what it is that you could do.
Gary Burnette 22:33
The best advice I could give is to make sure you understand two critical things. Number one, make sure you understand what the expectations are about your data set. What do you expect the data to tell you? Remember, data comes from those executing the general process, right? They're working in a tool. They're creating artifacts that become the data we build our insight around. So I think it's two things. Like I said, one is to really understand what the expectations are around your data and how it's going to be used. And then also understand how it's being created. Make sure that the creation models the potential use. And look for those places where you might have gaps, and those places where you could introduce some level of bias into the conversation because you're doing an incomplete job of collecting or using the data. That would be my advice.
Sarah B. Nelson 23:28
There's so much happening, and not just in data, but in the applications of data. Data is everywhere. And now that we're able to get data from so many places, it both opens up tons of possibilities and also really demands discipline: Why are we using data? How are we going to use it, and is it the right data? How do we trust it? So it's both very exciting and daunting to me at the same time. But I really think that for organizations that pay really clear attention to data governance, that's going to be an amazing competitive advantage as well. I really enjoyed this conversation, and I hope that you all did too. I'm so appreciative to both Gary and Claire for joining us and really probing these issues together. Thank you so much.
Gary Burnette 24:12
Thanks, Sarah.
Claire Thompson 24:13
Thank you.
Sarah B. Nelson 24:15
If you've enjoyed this conversation, be sure to like and subscribe and share it with a colleague who might also be interested. Thank you.