As people join: thank you for joining us this morning or afternoon, depending on where you are in the world. I'm Ollie, one of the cofounders and CEO at Count (count.co), and I'm really grateful to have you with us. This is gonna be a session where we're talking about why clarity is more important than AI in your data stack. If that's not why you're here, then you should leave, because that's what we're gonna be talking about for the next forty minutes or so, plus a lot of chances for discussion and questions. So today, as you can probably imagine, we're talking about AI, and our goal today is really to try and throw some realism on the AI hype train. That doesn't mean we're not considering AI and how important it is, but we're really trying to make sure that we're discussing what really matters and where AI can be most helpful in the data stack. And to do that, we've got an amazing panel with us today: people who are at the forefront of leading their organizations through this journey, thinking about AI in the context of their business and their data organizations, and also building products in the data space, trying to bring AI in at the right time, in the right way. So my goal for this session is really to help everyone here build a better mental model of how to think about AI: to have a very clear mental model of what matters, how to go about the change, how to think about it, and to hear from real people thinking this through out loud, to help you understand that you're not alone, that maybe you're not as far behind as you think you are, and that there are many areas of opportunity to go after. That's really the goal: a really open conversation. And as we're talking, we'd love to hear your questions as they come up.
We'll try and answer them as they come, if they feel appropriate, but we're also gonna leave lots of space and time at the end for questions, which we'll definitely answer then, so make sure you stick around for that. We'll also be asking for your questions and feedback through polls, to make sure this is dynamic and as useful for you as possible. Before I stop talking, let me introduce the panel of people who are gonna be talking this topic through with us.

Firstly, we have Jessica Franks. Hello, Jessica. Jessica is, and I think I've got the right title, Head of Data Analytics and Engineering Manager at Not on the High Street. You are one of the forefront brands, particularly in the UK, for ecommerce: a kind of reseller of boutique goods. Is that a good way to put it, Jessica, or was that terrible?

That was terrible.

Okay, no, fair enough. Do you wanna give us a better explanation of what Not on the High Street does?

So Not on the High Street is an online marketplace for small businesses and small brands within the UK. They're able to sell their handmade, curated, or designed items on our platform, and there's lots of really cool things that you can go and buy.

It is very cool. Despite the fact that I can't describe it, I do use it, and we do shop there as a family. So thank you for that. Secondly, we have Michael Rogers. Michael is the Head of Product Analytics at Bumble, which I can definitely say confidently is an online dating app. That feels like a good description, Michael. I assume you're not using us?

I have never used you. I say to my wife all the time, I'm very glad I wasn't around in that era. I mean, we found each other before dating apps took off. I would not do so well.

Well, we're hoping to change that. So, yeah, lots of very cool things coming out. There you go. Thank you.
And then finally, we have Mikkel Dengsøe, who is the cofounder of Synq. Synq is an up-and-coming data observability platform. Mikkel, I hope that's a good description. Do you wanna give a better pitch?

No, you got it right.

Yeah. They're working with a bunch of cool companies, and prior to that, Mikkel spent a lot of time in the UK tech scene, most recently at Monzo. There you go. Well, thank you so much for being here, everyone, and for being willing to share your thoughts on this topic, which is, well, obviously the topic of the month in technology. As you can see from the panel we've got here, we've got people who are leading teams and building products, both for their end users and for data leaders and data teams, to engage with this topic. So we've got a really broad panel. To set the scene, and to give you a sense of how I want to manage this discussion, I wanna use a bit of time now to talk through, with you all here, where your head is at right now with AI and how you're applying it: just help us set the scene on how you're thinking about this challenge and what we're seeing more generally, to help us get a grip on the hype and see what's happening day to day. Then we're gonna start talking about the challenges of data teams more generally, and we'll use an audience poll to help us steer that conversation and really make sure we're landing back on what really matters: what the fundamental problems are that we've got to solve as a data community and data industry, and how AI applies to that. And then we'll start looking forwards: what does it mean for our teams, our skills, and how we think about our position in the organization?
So that's the very high-level narrative I want us to go on over the next thirty-five minutes or so. Maybe to start off, let's do a bit of a round table and get a sense of how you're thinking about it. What are your emotions when it comes to AI? How are you thinking about it, and how are you starting to explore it in your own work or as an organization? Jessica, maybe we could start with you. Do you wanna give us a sense of how full your head is of AI right now? How sick of it are you? How excited are you? Nervous? All of the above? And then maybe how you're applying it as a team and as an individual, as you're starting to bring it into your day-to-day work.

Sure. So I guess I'm both excited and terrified by AI. Look at how fast it's evolving: every week there's something new that somebody's releasing, and you're just like, oh my gosh, how much better can it get, and will I even be employed in five years' time? I think that's where the worry comes in, because you can see it automating a lot of things that people were doing before. But the way I like to think about it is that it's gonna get rid of a lot of the tasks that were taking up a lot of unnecessary time. In my team, I've encouraged people to use and explore things. We use the Windsurf editor to help with some development tasks, and it's really good. I used it to build a web scraper, and maybe we shouldn't be saying that, but I needed some data off a website for a project we were running. It's something that would have taken me a couple of hours to build before, and it built it in a couple of minutes. Here is the HTML structure; it quickly built something, and it was debugging at the same time. And that's when my mind was blown, and I was like, oh, okay.
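(For context, the kind of scraper Jessica describes can be surprisingly small. Below is a minimal, hand-written sketch using only Python's standard library. The `product`, `name`, and `price` class names are made-up placeholders for whatever the real page's HTML structure uses; in practice you'd fetch the page first, for example with `urllib.request`.)

```python
from html.parser import HTMLParser


class ProductScraper(HTMLParser):
    """Pull (name, price) pairs out of a hypothetical listing page.

    Assumes products are marked up roughly as:
        <div class="product"><span class="name">...</span><span class="price">...</span></div>
    Adjust the tag and class names to the page you are actually scraping.
    """

    def __init__(self):
        super().__init__()
        self._field = None    # which field we are currently inside, if any
        self._current = {}    # fields collected for the product in progress
        self.products = []    # finished products: [{"name": ..., "price": ...}]

    def handle_starttag(self, tag, attrs):
        css_class = dict(attrs).get("class", "")
        if tag == "span" and css_class in ("name", "price"):
            self._field = css_class

    def handle_data(self, data):
        if self._field:
            self._current[self._field] = data.strip()
            self._field = None
            # Once both fields are collected, emit the product and reset.
            if {"name", "price"} <= self._current.keys():
                self.products.append(self._current)
                self._current = {}


def scrape(html_text):
    """Parse a page's HTML text and return the extracted products."""
    parser = ProductScraper()
    parser.feed(html_text)
    return parser.products
```

The point is less the parsing itself than the time saved: an AI assistant can produce and debug this kind of boilerplate in minutes, given the page's structure.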
It's a bad time for junior engineers, because now if I can just give really clear, concise instructions to an AI, it's gonna get things done for me. So we're using it to help with some very small development tasks: writing code for things that are really easy for an AI to understand. And as a company, there are a couple of tools we use. We use Google Workspace, so we've got Gemini incorporated in our meetings. That part I love: taking meeting notes. I can't do two things at once sometimes, so I can't listen to someone and write quickly, and the fact that AI can take meeting notes now, that is the best use case in companies so far, I think. We have a couple of external tools; our browse and search on our website are powered by Google's AI solution. And then we're starting to slowly experiment as a data team with how we can incorporate AI into some of our workflows. So we're trying to understand, do we wanna download models and use them that way? Snowflake has some built-in AI capabilities that I've done a small POC on recently, to try and classify some different product data that we've got. And this is also making people in the business excited; they keep asking, what about this, what about this? And I'm like, it's kind of expensive, so we have to be really careful about what we wanna do and how we use it. Because yes, we can just go ahead and do it, but is it a good use case? We wanna understand whether it's worthwhile, or whether it's just, that was cool, I used AI to do something. So, yeah, very beginning stages. We're using third-party tools that are way more advanced with AI, but internally we're only starting to scratch the surface of how we can use it, whether it's even feasible for us, and whether a good old traditional ML model would serve the purpose just as well.
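(An aside on that last point: before paying for a per-row LLM call, it's often worth benchmarking the simplest possible baseline. The sketch below is a made-up keyword classifier, with illustrative categories and keywords only, of the kind of "good old traditional" approach that sometimes serves the purpose just as well and costs nothing to run.)

```python
# Illustrative baseline: keyword-match product categories before
# reaching for an LLM. Categories and keywords here are invented.
CATEGORY_KEYWORDS = {
    "jewellery": {"necklace", "ring", "bracelet", "earrings"},
    "homeware": {"mug", "cushion", "candle", "vase"},
    "stationery": {"notebook", "card", "print", "planner"},
}


def classify_product(title):
    """Return the first category whose keywords appear in the title, else 'other'."""
    words = set(title.lower().split())
    for category, keywords in CATEGORY_KEYWORDS.items():
        if words & keywords:
            return category
    return "other"
```

If a baseline like this (or a small trained model) already hits acceptable accuracy, the expensive LLM call has to justify itself on the residual errors, which is exactly the cost question Jessica raises.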
I wanna come back to that, because I do think we forget that the definition of AI has maybe changed in people's minds, but actually there's been AI for a long time when you think about ML. Michael, how about you? Same question to you: how are you thinking about it? How are your excitement and anxiety levels feeling, and where have you started to use it?

Yeah. So, I mean, it is a game changer; we have to be honest about that. So, very excited. I'd probably bucket it into four areas. There's the individual: just get stuff done. You can now do more, so, like, the Windsurf stuff, Cursor, those kinds of tools. There's doing the work we do better: I think the team can now be more ambitious about the things we want to do, purely from an analytics and data science perspective; we can really raise the bar on what we want to do. Then there's some foundational stuff: as a data platform, as data scientists, how can we use unstructured data more? Semantic layers probably become more important. How do we start to invest in that stuff where we know AI can then do more with our data? And then there's, horrible phrase, but the adjacent possible. What is going to be unlocked that we couldn't have done previously? Let's think about that stuff. So those are the four buckets we're thinking about.

Those are really helpful buckets, actually. I like that. You're thinking very much in line with how I think about it, so I'll just play it back: it's very use-case focused, be that a big use case or a very tactical one. Same with Jessica, actually: very much thinking about applying it as a tool in the right places. Yeah.
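(On Michael's point about semantic layers: the idea is to pin down metric definitions as metadata that both humans and AI tools can reuse, instead of letting each query re-derive them. The snippet below is a tool-agnostic, purely illustrative sketch; real semantic layers such as dbt's MetricFlow, LookML, or Cube each have their own schema, and the metric, table, and field names here are invented.)

```yaml
# Illustrative only: a governed definition of "daily orders" that an
# AI assistant could look up instead of guessing SQL from raw tables.
metrics:
  - name: daily_orders
    description: Completed orders per day
    model: fct_orders          # hypothetical fact table
    calculation: count
    filters:
      - field: status
        value: completed
    time_grain: day
```

The investment pays off twice: humans stop arguing about which number is right, and an AI layer has something trustworthy to ground its answers in.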
I think that's the lucky part of working very closely with product teams. You have things like the job to be done, or Bezos talking about the thing that doesn't change, and a lot of talk about clarity being the most important thing. So what is the stuff that doesn't change? Let's start there and work backwards from that.

Yeah, that's cool. Mikkel, finally to you. Thank you for being patient. You're obviously building products for data teams, to help them understand what's going on in their data stack and have that kind of clarity. If I'm right, you've been building Synq for a few years, but AI really started taking off when Synq was still at a very early stage. How has it changed the vision you had at the start? How do you think about AI being part of the mix: is it changing everything, or is it just another tool to add in?

Yeah, that's a good question. I can give you a more conclusive answer in a few months, but we've spent the first three years building out the foundations: a lot of the integrations, a lot of the systems behind it. And a few months ago, we started spinning up a new product called Scout. It's gonna go live soon, and it's essentially using all kinds of information that we have about your metadata, your code commits, and all kinds of other stuff to help you do some of the data grunt work better. Some examples: when it comes to writing relevant data tests in dbt, or whatever tool you use, what if you could just specify that as a bit of a recipe in a markdown file or a document, and then the AI could generate those kinds of tests? That's something we're seeing actually working quite well already.
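(To make the "recipe" idea concrete: imagine a markdown file where each bullet names a model column and the checks it should have. The sketch below is a hypothetical, deterministic version of that mapping, with a made-up recipe format, producing the dict shape of a dbt `schema.yml` model entry. In the product Mikkel describes, an LLM would presumably do the authoring from looser prose; this just shows the shape of the transformation.)

```python
def recipe_to_dbt_tests(recipe_md):
    """Turn recipe lines like '- orders.order_id: unique, not_null'
    into the dict shape of dbt schema.yml model entries.

    The '- model.column: test, test' format is invented for illustration.
    """
    models = {}
    for raw in recipe_md.splitlines():
        line = raw.strip().lstrip("-").strip()
        # Skip headings, prose, and blank lines that don't match the format.
        if ":" not in line or "." not in line.split(":", 1)[0]:
            continue
        target, _, tests = line.partition(":")
        model, column = target.strip().split(".", 1)
        wanted = [t.strip() for t in tests.split(",") if t.strip()]
        models.setdefault(model, {}).setdefault(column, []).extend(wanted)
    return [
        {"name": model,
         "columns": [{"name": col, "tests": checks}
                     for col, checks in cols.items()]}
        for model, cols in models.items()
    ]
```

The resulting dicts mirror the `models:` block of a dbt `schema.yml`, so serializing them to YAML yields test definitions dbt can run.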
And then the other thing, which is very timely: we now have a debugging engine built using AI. Petr, our CEO, did our product demo on Friday, and I had made a code change earlier in the week to a dbt model that broke one of our monitors, which went off.

That's very honest.

Yeah. This AI model completely debugged the issue. It got to the code change, it understood the context, and it basically said, this is an intended change; you can just accept this test as a resolved instance. So I can see this whole under-the-surface workflow completely changing in the next six months to a year. Very excited as well.

That's cool, so excitement generally from you. I think what's really helpful in the way all three of you describe this, and Michael said it well with the jobs-to-be-done framework, is that what we're describing is workflow changes: taking sections of very painful work, coding ultimately being a very manual process at the moment, and making it much, much faster. And just to play it back for the audience's sake: you're all treating the bigger changes of workflow, automating larger things, as a more cautious, early-stage exploration, but you're already executing on key tasks right now, pretty much straight away, which is amazing to think about. The smaller, more tactical tasks you're cracking on with. That's really cool. Maybe the question I'd then come back to you all with, to help finally set the scene: what are you most excited about? What do you see as the potential, accelerating forward a few years, maybe further? Where do you see AI totally changing the way we do things? We've discussed some very powerful use cases, building a scraper, building debugging tools, but accelerate further forwards.
Where do you think, and I know it's speculation, I won't hold you to it, where do you think we're gonna be in two years' time? What excites you most about the idea of how AI could change things for data teams?

I'm happy to go. For me, it's velocity. I just think it's so exciting: the speed at which you can get stuff done, prototype, try things out, build things, simplify the message you're giving to the business, all that stuff you can now do at breakneck speed, and that is the name of the game. I just think that is very, very exciting.

From my side, I think technical skills are gonna become less important, because AI is just gonna be your buddy on that. If you don't know something, three sentences later you do know it all of a sudden, and it can help you figure things out and explain them to you very well, which is something you don't always get. The ability to just have someone explain why something works in a certain way, especially for people who don't like asking, in quotes, stupid questions: they're more comfortable speaking to an AI about it. They can have a conversation; they don't have to waste somebody's time. So I think technical skills become less important, and those more human skills become way more important: learning to communicate really well, because you have to communicate with AI properly to get what you need out of it, asking the right questions of the business, actually getting a very clear understanding of the problems you're trying to solve. This is just another tool in your toolbox to help you solve that problem. So, yeah, less technical skills, more human skills, which is ironic: the robots are taking over, and now human skills become more important.

Yeah, play to our strengths. Stop Mikkel writing dbt code. Joking, Mikkel.
Sorry to pick on you.

Yeah, no, I'd pick the same two, but maybe one more that stands out. I've often had this conversation with people: what's the ROI, the impact, of a data team? And the best example I've seen to date was at Monzo, where one of our very good data scientists launched a machine learning model against fraud, and over the course of a few months we saved many millions of pounds in fraud losses, by pretty much the work of one or two people. At that point, the CEO came to us and was like, can you please hire ten more data people? And I think this AI thing that's happened really lets more and more people get the chance to chase these opportunities and have this super clear ROI.

I love that. Yeah. Michael?

I was gonna say, I think the technical one could go either way, because you could get super technical people that can now build agentic stuff, and a ten-x developer becomes a thousand-x, and they can just do way, way more. But, equally, you maybe get over the meme of the data scientist being just a bad coder but good at stats, that sort of thing, and now they can up-level. So I think both are gonna be true: technical skills become less important for some, but some people might just eat the work, and the jobs, of a lot of other people.

I love that. I spoke to one of my friends, who's much more successful than me; he's one of the youngest partners in private equity. And I asked him how, at a macro level, they are thinking about AI. He had a very simple model I wanna share, which is that AI is a productivity game. Ultimately, like any technology, it's a productivity opportunity.
The more you apply it, the more you can reduce costs and save time, until it becomes artificial general intelligence, and then we're all screwed, which was a bit tongue-in-cheek. But the way he put it was that, like any technology, the question is how dramatic a change you go for. When you think about, say, a job to be done, you can increase the speed of writing the code, for example, or you can say, I'm gonna build an agent which does the entire workflow in one go. The further down the automation path you go, the more productivity gains you're looking for from one intervention, but the higher the risk it's gonna go wrong, the higher the risk that the play doesn't work. And I think that's a really helpful mental model as we think about applying AI in the organization. You could shoot for the moon and say, we're not gonna hire anyone else any more, we're gonna go for agentic AI workflows all the time, but there's a real risk that goes wrong, that we can't fully execute that yet, that we're not mature enough for it. Or you can implement more of a steady-state approach and just get a two-x or three-x improvement in productivity. And I love that mental model, because rather than looking at AI as one massive revolution, you're thinking about it as a productivity game where I can go for moonshot deployments, or for still very dramatic but incremental improvements where there's still a human involved, for example. It's a very helpful way to think about the risk-versus-reward matrix in terms of how you're approaching use cases. It's not the only thing that helps you work out where to put AI.
It doesn't give you any sense of where to apply it, but it does give you a way of thinking about it, and of playing back to the business that the bigger the bet we go for, the more time we're gonna need to get it right. The question I'd love to ask, and then maybe we should move on to the actual problems of data teams: as a team, as an individual, or as an organization, how are you thinking about betting on use cases, and at what level to apply AI, or to start thinking about applying it? Because you could say, we are now an AI business, there are no humans; obviously that's the full extreme. And you're obviously already playing around with making my day-to-day job better. Is there any kind of strategy about how to apply AI and at what level of impact, or is it just, as good ideas come, we give them a go? Michael, go ahead.

Yeah, I think you absolutely start with the get-stuff-done route: how does it help us do what we do today? Let's start there. But I think there is this new way of doing things. Like you said, even if you went fully agentic, that is just automating the current thing that exists. So I'm like, what's the new thing that is now possible, either more ambitious or just totally new? That is the thing we wanna spend a lot of time thinking about. And even if we don't know what the answer is, we'll ask, okay, what's the foundational stuff we can do now that will make it easier in this world that is undoubtedly coming? But, yeah, start with getting your house in order, all the foundational things, get stuff done better, and then kinda see what happens.

Love it.

Yeah, I agree with that. Definitely get your data in order: make sure your data is tracked and your data is there.
But on the way of working, Tobi, the CEO of Shopify, had this internal memo that went viral this week, where he basically said, if you're at Shopify, you're now expected to use AI as part of your job, and your performance management is gonna be based on that. And I actually think there's a lot of sanity in that: you should expect your team and everybody there, even in their own time or as they work, to keep up with these skills and try to use tools, even in single-player mode, whether that's Cursor, Perplexity, Gemini, whatever it is. The good ideas might not necessarily come from the people leading the data teams; they could also come grassroots-up from people trying stuff. So I thought that was super clever.

Yeah, that's helpful. That's an interesting memo, isn't it? Just the expectation that you should at least have the mindset of wanting to change. What that's ultimately pushing for is a change of mindset, that we are looking at AI actively and proactively, and not being defensive around it, which is really cool. Jessica, any thoughts from you about how you're thinking through the impact or change?

So we haven't done any formal AI things. There have been a lot of side discussions, giving people access to things like Gemini, and there are guidelines around how you should use AI, and I'm pretty sure they're being ignored because they were written quite a few months ago. But to Mikkel's point, ideas are now not just coming from a few people within the company. More people are exposed to AI; I think ChatGPT has been really good for that, because now everybody has used AI at least once, even if it was just to generate a picture of what your dogs would look like as humans.
Everyone is more exposed to it and to the possibilities of this new technology, and they're using it in very creative ways. Engineers tend to be very methodical about how they use things, but then you have more creative people with the exact same tool, and more ideas are gonna be generated, because somebody who has a different way of thinking is now using, and unlocking, the power of AI. So it's more about encouraging people to experiment and try it out, and giving them prompts they can use to help with certain things. But, yeah, even in my team, not everybody uses AI for everything. I keep having to say, why are you struggling? Let's just throw this into an AI and it'll generate the document we need quickly, or formulate your thoughts in a very constructive manner. Don't be afraid; it exists, let's use it. You don't have to keep falling back on old ways of doing things. So, yeah, I think we're very, very early in the AI journey at Not on the High Street, but it's exciting to see how other teams are using it and thinking about it.

Yeah, I think that's really helpful. Just to tie it back to the mental model: AI is a technology, and you can apply that technology well or you can apply it badly. And certainly the amount of noise going on around it makes it very hard to pick apart what really is a good deployment of AI and what is not. I don't think pictures of dogs as humans is a bad use case, but it's not necessarily gonna help you move the needle on the business. That wasn't what you were saying; I'm just making it a trivial example.
So what I want to use this discussion for now is to actually get into the problems that data teams are facing, and then start thinking about how AI, or not, is gonna help with those things. Because otherwise you've just got this cool technology which we love playing with, and focusing down on what really matters can be really helpful for thinking about how we get the most value from AI as fast as possible, and also where AI may not be the right thing. Because part of this is that there are some problems which can actually be solved with or without AI, which is easy to forget in this kind of hype phase of the technology. So we're gonna give the audience a poll about what they think is the biggest challenge facing them right now in driving value from data in their organizations. The poll is up for you now, so go and click on an answer. And while that's happening, maybe we could kick off with Mikkel. Maybe you could give us a sense of what you're seeing from the market as the biggest blocker for organizations to use data well, and then we'll come back to the results and pick them apart later. So, Mikkel, while the poll is building, can you give us a sense of what you think is the biggest blocker? I suspect you have a hypothesis.

Yeah. I think dbt did a survey, they do it every year, where they ask, I think, several thousand people, what are your biggest problems and blockers for dealing with data? And the thing that always comes out at the top is data quality and data ownership.
And I think they're very much tied together, because ownership problems are also problems coming from upstream teams, and data quality problems come from source systems and so on. So those two just consistently come up very, very high on the list. And a few of the root causes, for why I think that's happening: one, data stacks and their complexity are just getting really big. It's not uncommon for data teams now to have many hundreds of tables or dbt models, or in some cases thousands or tens of thousands. And at Monzo, I think at the time I left, we were more than a hundred people in the data team, and just that complexity makes things very difficult. On the other hand, I also see more and more teams use data for things that are what we call business critical, instead of kind of exploratory ad hoc reporting. I was talking to a hedge fund last week; they do trading every day based on their data warehouse, and if they get one day of data wrong, they lose millions of dollars. So data stacks are very big and very complex because it's easy to build, they're increasingly used for very, very important stuff, and that combination just leads to a lot of tricky problems that we haven't quite solved.

Thank you, that's really helpful. Jessica, how about you? What's the thing that you, or your team, are most thinking about in terms of the biggest blocker?

Within our team, and going off some of the results coming in from that poll, the biggest problem we're trying to solve right now is getting everyone in the business to talk to each other. Because as a data team, we talk to different teams, and then you start to see conflicting priorities: this person wants to do this initiative, that person is doing that initiative, this one's gonna drive this KPI, and that one's gonna drive that KPI.
And then they conflict, and you're like, you all need to talk to each other, because you're working against each other unintentionally. You want me to measure this, and you want me to measure that, but you're both gonna be disappointed in the results because you're working on different things. As a data team, you have that overview of the entire organization, well, at least in a small company; in bigger companies it's obviously a bit different. You're speaking to all the teams, so you're starting to become that single source of truth. So how do you get everyone on the same page and working towards the same goal? Because if nobody understands the impact of the things they're working on, or how it affects somebody else's work, then everyone's gonna be disappointed at the end of the day. So, yeah, we're moving away from focusing on getting the data platform into a really good state, towards helping solve some of the business problems.

I love that. Does the business even know what those problems are?

Yeah, exactly. Help them find them.

That, I think, is a good example of where clarity is the most important thing. That's really helpful. And you're right that as a data team you have that unique position of visibility over the whole business, at least from a data perspective, and seeing the conflicts is a really good point. I'm gonna move on to Michael. Michael, what about your perspective? I know the poll results are here, and it's tempting to use them; we'll come back to those in a second. But from your personal perspective, what do you often find is the biggest blocker?

Yeah. I think we're in a very privileged position, where the business comes to us a lot, and they really look to our team for a lot of insights and help and that sort of thing.
So it is very much what Mikkel said in terms of, one, the complexity. Data just comes in from a million sources, in different formats, all that sort of stuff. How do you make sense of it and make it more meaningful and trustworthy and reliable, all that good stuff? And then people wanting to use data in more production, operationalized workflows, and that's very different, and maybe where the world is going. We're supposed to now be more like software engineers; maybe not. But, yeah, it becomes a big challenge of: what are we actually producing? Do we need to start thinking about whether someone needs to be on call? Is this a production workflow? This is kinda crazy. So, yeah, lots of stuff like that, which is great. That's very cool. That probably puts you in quite a unique position; being the trusted adviser of the business is a pretty cool place to be. Well, let's look at the polling, because broadly, what Mikkel said is that the biggest problem is the complexity of the data stack from the back end, from a data perspective: so many different assets to manage. Jessica, you gave much the same kind of problem but from the front end: how the business is engaging with data, and helping the business understand itself really well. Both of those are big problems, and, back to the theme of the webinar, they're about clarity. The idea is that the better we as a team understand our data stack, or the better a business understands itself, that's the way to get to a better paradigm of operation. If we look at the answers, I'm actually surprised how flat these answers are.
It's a small survey in many ways, and this is obviously not the whole market, but we've got basically low data literacy and no single source of truth at the top, roughly thirty percent each. But then data quality and too many ad hoc requests are a pretty close second. So basically, these are the biggest challenges. When you look at a problem like low data literacy, it's very tempting to look at it and go: let's get an AI chatbot for data requests to solve the ad hoc requests, get people asking questions of data. That's the obvious answer: the chatbot is king, let's just implement that. Has anyone attempted to do that yet, or thought about whether that's the real root cause of this problem? I see loads of AI chatbot demos on LinkedIn all the time, and I'm pretty certain that when you use these things, if you don't fix the fundamentals, they're gonna run out of steam very quickly. Yeah. I was gonna say, I tried using Looker's built-in conversation thing they have now, and all I asked was how many orders did we have yesterday. And it was like, oh, I'm still learning, I can't answer that. And I'm like, okay. But that's the one basic question my users would probably go ask the data. So is it because the metadata we have around our data in Looker isn't sufficient for it to answer that question? Or is it just a really shoddy implementation of AI that's not useful for this use case? I think a lot of tools are just putting AI in for the sake of having AI, but it's not actually solving a problem. Yeah. I think that's been interesting. I was playing around with Omni, this new Looker competitor, last week, and we're using it internally.
And it actually has AI where you can ask it to create the formulas, like the calculated fields and derived tables, which I always found super painful in Looker. And that works super well. So it feels like that kind of confined problem space is a really good area for it. But if you imagine solving every ad hoc request with a chatbot, I think that's still a little way out, at least. Yeah. I think if it's trained for a very specific problem, then it's gonna do it well. But if it's broad, it's just like when you ask AI a question and it hallucinates the answer, but it's very confident about the answer. And you're like, yeah, of course, it's this way. Like, how many R's are in the word strawberry? That's the one you always see going around on LinkedIn. It's very confident when it gives the answer, but it's not necessarily the right answer. So if you do start introducing these tools, what are the checks and balances you have in place so it's not gonna give somebody a wrong answer that they then go and use to make a decision? Because it didn't have the right data, or the question wasn't phrased in the right way, it's just gonna give you a very confident "oh, there's five, go do that." And then somebody's made a whole business decision based on AI not giving them the right answer. I think this is a massive, massive point when we talk about low data literacy. Sometimes it's not people's fault. There are books written about how to lie with statistics. This is a really hard problem. And to your point, if someone makes a mistake in engineering, the website goes down and you know about it very quickly. But if you make a bad business decision, you might open a new market or a new product line based on it, and the feedback loop is gonna be really, really slow.
And so I think this is our role, you know, as Ollie said: the trusted adviser. Where and how can we create a contained space where people don't need high data literacy? It can just be an account of things, an understanding of what's going on. But some of the more complex questions people are gonna throw at this stuff will be presented with great confidence. And we all love having, like, confidence intervals and that sort of thing, and then you always get told, get rid of those, just give me the number. And I think that is our job: managing the level of uncertainty that LLMs and that sort of thing just remove. Yeah. I love that. That's really helpful. We have a job to do here. We are stewards of the information in the business. If you've heard me speak before, I think of signal-to-noise ratio as the key thing. And the signal-to-noise ratio doesn't improve by producing answers faster; that actually just floods the business with information. It's about producing great answers with more focus, giving people a better understanding of what's really going on. We had a good question come in, and I might throw it in right now because it may help us dig into this a bit more. Angus asks: which of the four problems in the poll does AI solve the best? And I think that's ultimately the question, which is actually quite hard to answer if you think about it at this level of detail. Has anyone got a way they would tackle the answer, looking at these different problems? I think there are definitely bits of all of these that AI could help with, but it's not a silver bullet for any of them; that's my take. Anyone wanna challenge me on that and give a better answer?
I don't know if it can, or to what extent it'll be able to, do much for low data literacy. For the ad hoc requests, something like an ask-your-data tool where you can just pose a question, I expect that to be everywhere in a few months. And for single source of truth, building in what's becoming very fashionable around test cases and data contracts and all that sort of stuff, I expect a lot of work there that AI could help with, but it helps you get the thing done; it doesn't do it itself. Because I guess everybody has a different interpretation of those things. Data quality means different things to different people: how deep you go into fixing data quality, at what level, and what is good, what is bad. There's no clear definition of this is good data quality and this is bad. And sometimes you think it's good until somebody points out this isn't quite right, and you're like, oh yeah, you're quite right, for that use case this isn't quite right. So it's all very dependent on the problem you're trying to solve, or the business area you're in. Low data literacy at one company is not the same as low data literacy at another. At some companies, nobody even knows how to use Excel, whereas at other companies they're excellent at VLOOKUPs but not comfortable using SQL, as an example. Everything is at different levels and very dependent on your space and your company. So I don't think there's a single good answer for any of them. Yeah, I think that's spot on. I basically think the categories are too broad to give a final answer. If you look at data quality, for debugging and root cause analysis, yes, I think AI will play a big role. But if data quality is, do people input the right field in Salesforce, that's a human problem.
For each of these, you'd have five to ten different subproblems, and within those it's easier to say which ones AI can help with. Yeah, I think that's really helpful. If I distill down what we've discussed, it's: where is AI working really well? From Mikkel's suggestions and Jessica's ideas, it's either a productivity gain, where within a well-defined task I can get from A to B very, very quickly, or it allows you to remove artificial technical barriers, like how do I write a level-of-detail function in Tableau or Looker, jumping over the syntax hurdle that stops me coding. The question ultimately, which I think gets back to the essence of value, the thought model, I should say, is: is my ability to write this code in this language, or my ability to produce more widgets per second, the fundamental answer I need for this problem to go away? And I would argue there are definitely bits where that's true. If there are too many ad hoc requests from the business and they can be solved by getting from A to B faster because you can code quicker, then yes, that's a great solution to the problem, or at least part of the solution. If the barrier is people don't know how to write calculated fields quickly, then yes. But if the problem is they're asking questions which are fundamentally not the right questions, or the problem comes from the fact that even though you've given them a self-service environment and a semantic layer, they're asking questions that aren't bounded by that, then the answer is no. Because it's not about productivity, and it's not about technical barriers. And I think the same thing is true for data literacy and for single source of truth.
If you think about AI as being about productivity or removing artificial technical barriers, then what you're left with is the human elements of the problem. That's the root cause you've gotta think about, and where clarity, I would say, matters most: how do you help the humans and their workflow? That would be my take on this. I think that summarizes what we're all saying and puts it together in a really succinct way. Hundred percent. Yeah. I wanna move on because there's so much here. I hope that as you're listening, we're helping you build a mental model: hearing tangible examples, hearing tools, helping you wrestle with this, and getting to what fundamentally matters, which is the right solution to the right problem, applying technology carefully. If there's one thing you can hear from the panel, it's that just pouring AI into your organization and hoping it returns a good outcome is no guarantee. Carefully applying it, even boldly applying it, to the right problems, which can be very ambitious, is the right thing to do. It's the "work to be done" mentality that is the most important way to frame this. I want to think a bit more about the underlying capabilities. Jessica, you mentioned before how you think we need to get more into the soft skills, that hard skills are gonna disappear in two years. I wanted to think about capability matching, how you're thinking about helping your team, and what you're hiring for. Is there anything you've been thinking about when it comes to team mix which could be helpful for the audience to hear about?
Are you already changing your hiring expectations for technical skills as you look out into the market? Is there anything you're doing about skill mix and training to prepare for AI, or to start thinking about it better? So, we're quite a small team, and we've gone through some restructuring lately, so it's kind of rethinking what data at NOTHS looks like. It's how do we make sure everybody has the most well-rounded skill set. Very specialized roles, it depends: bigger companies can obviously have more specialized roles, but if you're working in a startup environment, in smaller companies, you have to be more of a generalist. You can't just know how to do data engineering; you also have to understand the data modeling and what problem the business has. Do you know how the business makes money? Do you know what data impacts how the business makes money? So it's making sure everybody's more well-rounded across their skill sets. Within the data engineering and analytics engineering space, the projects we've been working on have really lent themselves to people cross-pollinating skill sets. Because when you're a small team and this person's on holiday, someone has to do the work that needs to happen. So you end up learning a slightly new skill. You may not be as quick as the person who specializes in it, but now with AI you can ask a question like, how would I do this, and hopefully it gives you guidance in the right direction. So, yeah, it's really about helping people develop multiple skills, not just, now you need to be a really good specialist. Yes, you can be a specialist, but you still have to have an understanding of what else happens, or what else feeds into it.
And then I'm not even sure what data roles will exist in the future. We've had analytics engineers come in, and that wasn't a term that used to exist; it's a mixture of skill sets. And now what are we gonna have? Five years from now, is it just gonna be you're a data person, or an AI specialist, or just a problem solver, or are we all gonna be product people? Because product people know how to get to the real root cause of a problem. They ask those five whys and all those questions to really understand the problem and how to solve it, and then bring a whole bunch of people together to solve it. So it's interesting. I don't really have an answer, I guess. I mean, I think it's great. I love the way you think about it: broadening the capability set, being less worried about specialisms, thinking about the fundamentals. What you're saying is, there are some fundamental skills that we think are critical, critical thinking and the problem-solving mentality, and they're what will ultimately remain. But is that a data person title? Who knows. There's a whole other webinar to be had about role titles in data and how important they really are. But the other key point is, there's still gonna be a role here, and the role links back to some degree of technical understanding, business problem understanding, critical thinking, and problem solving. Then everything else is about capability training and learning, which AI can massively help with. Sorry, I'm just playing back what you said. That's okay. That's pretty helpful.
Communication is gonna be more and more important now, because you have to communicate effectively with AI, but also with your stakeholders, to understand what the actual problems are that you're trying to help them solve. Because I think data people, at the core of it, are just problem solvers, and SQL is the tool a lot of us are using at the moment to solve those problems, or Python is the tool we're using. Maybe that tool is gonna change in the future, but we're still solving problems, just in a different way, because the SQL part and the Python part may be a bit more automated in the future. But yeah, you'll still need to understand the problem and the space that you're working in to effectively use the tools at your disposal. I love that. Yeah. Mikkel, welcome back, thanks for joining us again. We were just talking about capability and how we think skills are gonna change. Any thoughts, Michael and Mikkel, about the skill mix and the way you think the roles will change in terms of the spikes? Yeah. I mean, I couldn't have said it better than what you guys have said. To be even more succinct, it doesn't change much, in that you want people that are curious, are passionate about developing their craft, care about what the business is actually doing, and are a good person to work with. That's kind of it. So, yes, there are some base-level expectations around stats or the ability to code, but it's those fundamental things that will get you far. And so I don't think it changes that much for us. I like the point on curiosity. I think curiosity is such an important thing for people in data, because that's how you solve problems. What does this mean? What does this do? What happens if I do this? That inherent curiosity is what leads you to be a better problem solver. So Yeah. Yeah.
I think you can think about data roles as falling on a spectrum: are you closer to systems, like a data platform engineer or data engineer, or closer to business impact and decisions, like a data scientist or an analyst? And depending on where you fall on that, you might have to ask yourself some pretty serious questions if you do certain work over the next few years. If you're a data analyst who mainly just pulls ad hoc requests and builds dashboards on demand, then, as Jessica said, you should really think about how you could become a business problem solver, or more of a product person, or at least have very close intuition about customers or something like that. And I think the elephant in the room a little bit, in Jess's case, is the analytics engineer. That role where you sit in the middle and build these scalable data models, often in dbt, feels to me like it's gonna look very different in a few years. The systems side, stitching stuff together at the data platform level, that's a complex problem that's gonna persist. But that thing in the middle, I'm very curious how that's gonna go. You don't know. Yeah. And one other thing: we're already seeing the role become more like engineering. And you mentioned the spikiness, Ollie. I think that's really important: even within the same role, like data science, people can flourish while looking very different. You can be spiky in terms of the work you do, how you present it, what your skill set is, because you wanna create an environment where you have a very, like, heterogeneous team. Right? And I think it's gonna be really important that we don't over-index for, like, yeah.
Everyone just does, like, AI stuff, and we all become the same, because that will not work. We're gonna move on to some questions very shortly, so if you're in the audience and have any questions, please start adding them to the chat. While we do that, the point of what we're discussing here is ultimately that there are technical barriers right now and there are people barriers, and the people side is always gonna remain: the communication piece, the problem-solving piece, the ability to be critical thinkers who can understand and wrestle with the complexity of the business and make it clear to people. That's the universal constant we're talking about here. Those are the skills that are never gonna not be needed. Whereas the technical skills, we're kind of all agreeing, will probably evolve the most from AI. And I think that's what we're talking about here a bit: we can see a lot of change coming in the skill mix that we need, and in whether we need certain roles which are very technically orientated, but we all recognize the value is gonna come from the human side of the job, and that's increasingly gonna be the differentiator. It already is the differentiator. If you look at the best-performing teams that we know as customers at Count, they are already the best on the people side of things: the communication, the clarity that they give to the business. And the technical skills fall out of that anyway. We very rarely see the best organizations being ridiculously technical but not having human skills. Just on that.
If you're, like, an engineer, a technical person who wants to sit in a dark room and not speak to anyone, you might now be able to do a lot, lot more by, as Mikkel said, piecing things together. I do think there is a place for that role. And if they don't wanna be great on the comms side, and just want to build these crazy systems and be, you know, a thousand-x engineer, I do wanna make room for those people. I think they could be very valuable. I think there's also a very important factor to almost segment this discussion by, which is digital-native companies, which I think Not on the High Street and Bumble probably fall into, versus the older enterprise companies. We've started to work more with that latter group, and that is just a different set of problems that they face, and a completely different set of systems and tools and everything. So maybe they're gonna converge, but I would expect a bit of a delay, or at least a different set of problems, within the enterprise. Yeah. Yeah. I know. Sorry, I was just about to say, all the banks are still on mainframes. So there's a strong need for some technical skills, and those are never gonna go away. Yeah. And there are the digital natives, and then I'll be super interested in the startups coming up now, in what AI-native looks like, because the last thing we wanna do is become dinosaurs ourselves. So we are desperately trying not to become those folks. Yeah. If I put my product brain on, the thing I'm expecting we will realize very soon is that as AI gets more powerful, the biggest rate-limiting step is gonna be the ability of our brains, of humans, to understand what's going on. That interface between the AI and the human is gonna be the rate-limiting step, which is true now.
The ability of humans to communicate effectively and understand what's going on is already the block we're discussing a lot: the communication and clarity piece. The way the AI communicates to the human what's going on, that's also gonna be the problem, that interface to us. We're maybe the weakest link in all of this as we accelerate forwards. We should stop there. There are loads of questions I wanna come to; I've got at least three I wanna pick out if I can. One thing that's really important, and I think is driving a lot of the FOMO right now, and I'm gonna paraphrase this, sorry to the person who wrote it, basically says: my CEO is AI crazy, the FOMO is real, how do I deal with this? I guess what they're saying is they're getting pressure to drive AI just because it needs to be done, and everything they're reading makes them feel behind. How do you deal with that? Has anyone got any suggestions, or come across this? Maybe say it was in a previous job rather than your current one, so you don't get fired. But how would you deal with it? It's a very understandable, honest question. I mean, I think vendors and a lot of people recognize this, and so they put AI buttons on things where it's not really AI: the purple sparkle somewhere. And if that's what you need to do so someone can tick a box, fine. But the most important thing is that if you start to veer off the path of delivering value, you're gonna have a bad time. So do whatever you can to make sure you're doing the right things, and call it AI where appropriate; I think that's the right approach, but don't do something just for the sake of it. That sounds like wise advice.
How do you deal with that pressure? What's your approach to it? Because I think that is the right thing to do, Michael, but I guess it's the politics of dealing with an executive team who just wants to press the button so they can feel better. Yeah. I think it ties back to the business's strategy. If they have a good understanding of the business's strategy and know how AI is gonna fit into it, then if they have to wait, they have to wait; they have buy-in to the strategy. Or maybe there's a really small third-party tool with AI that just satisfies the "we're using AI to do this." Because that's all they want. They wanna be able to have their leadership dinners with fellow CEOs and just be like, yeah, we use AI to do this. It's just a talking point because everyone's talking about it. They don't wanna feel left out, just like we all feel left out when people talk about all these new skills they're learning, and you're like, oh, I didn't have time for that because I was dealing with a pipeline breaking or something, so I didn't have time to learn a new skill. It is FOMO. Yeah. And like we sort of talked about at the beginning, the get-stuff-done side is the easiest place to start. If you wanna buy everyone Windsurf or Cursor Pro or whatever, do that. "We use AI to help speed up our development processes, and we're shipping ten times faster," or whatever. There are a lot of ways you can phrase something you're already doing differently in a way that ticks that box. People are probably already using AI to summarize things, or to bring lots of information together, or to take meeting notes. That's a really good use case: oh, we use AI to take meeting notes, because that means one person doesn't have to sit and do it.
AI is doing it, and we can all concentrate on the meeting. So, we're running out of time, and I'm gonna cut in and ask one more question, I think. For those of you whose questions we haven't gotten to, we will get back to you offline and give you an answer face to face, so don't worry if your question doesn't get mentioned now. I've been a very bad manager of time here, apologies, but it's been a really good conversation. The last question, maybe one for everyone here, kind of round robin: what is the best execution of AI you've seen so far? Not necessarily in your organization, just externally, both in the data space and in other domains. Maybe that's a way to wrap up. I can actually start. A bit like Mikkel mentioned, I've seen some very tactical uses of AI: instead of dealing with formulas, you just say what you need it to do, and the AI builds the formula in a particular box, a particular part of the UI. It's a very elegant solution to a very technical, very niche problem, where it's a very safe failure if it gets it wrong, because you can still read the result. I think that's happening a lot across different tools, and it's a very powerful use case. Anyone else wanna go quickly on where they've seen a really good execution of AI that really does work? Jessica, I know you mentioned turning dogs into people, but anything else? I don't think I have one I can think of off the top of my head. I did see a video on LinkedIn yesterday that I saved because it was interesting. The guy had used AI to map out a certain concept within the business, and he was going along the lines of eventually applying it to a map.
I don't know if anybody's seen my talks previously on maps, but I was very interested to see how he was using AI to help him do that initial plot of the business problem he was trying to solve on a map. I was like, oh, that's a really cool use of it, and the different tools he was using, I'd never seen before. I'm definitely not keeping up my skill set. So that was a nice application I saw very recently, but there may be really cool things I just haven't seen because no one posted them on LinkedIn. The place where all AI use cases go to flourish. Mikkel and Michael, maybe you could each quickly give us the most inspiring example you've seen so far? Yeah. I mean, the example I mentioned at the beginning, that debugging example in Synq, was pretty exciting, because I just know how painful that is. From a personal angle, just coding: I used to code a bit of web design. A month ago, we made this website called databenchmarks.com, which is just a fun little project where you can benchmark your data team, its size and so on. And using Cursor, I could build that, with very little understanding of web development, in two days. That experience was just so much fun. Using Cursor for that, for me, was mind-blowing. That's awesome, thank you, that's a good example. We're actually gonna share the link to that in our show notes as well. We didn't get to it, but we were gonna talk about team structure earlier, and I think it's a really good example of benchmarking there. And then, Michael, maybe you can wrap up. I would say you're a very good early adopter, so what can you tell us has excited you most? Ollie gives me a lot of grief for this. I think there are two tools I would plug.
Meeting-notes tools like Granola: it's just unbelievable how good it is at doing this stuff, and its vision of being the center of where work happens is super cool; plus it's a London company, so big fan. And then all these product prototyping tools, the Lovable-type stuff, where you can take the output and put it into Cursor or Replit. They are, I think, really game-changing. And again, when it comes to this world of velocity, I'm very excited about the things you can do. Thank you. That's awesome, I love that, I think that's really helpful. Thank you all so much for watching; sorry we ran out of time to get to more questions. It's been a really great conversation. Thank you so much, Jessica, thank you, Michael, thank you, Mikkel, for all your input and for being honest about where you are with the journey. I hope it's been useful for you. As I said, our goal today was really to help you get beyond the hype and start to build a bit more of a mental model with peers, thinking through these challenges and what could be useful here. I hope that's been a helpful talking point. If you have any more questions or follow-ups, please get in touch; we're happy to help more, and we'll make sure we share the show notes afterwards, given we've had lots of pearls of wisdom as we've gone through the session. So we'll make sure we wrap that up. But thank you all for tuning in, and thanks so much to the panel. It's been really fun. Thanks, Ollie.