Welcome to today's session, From Service to Strategy: Transforming Your Data Team's Impact, with Dan Redgate, product analytics lead at Too Good To Go. Before we kick off this conversation, a few notes on how this session will work. Dan and I both prefer conversational sessions, so feel free to interrupt us with questions in the chat; I'll be keeping an eye on it, and questions are strongly encouraged as we go. The session is recorded, so if you have to drop off early, that's fine; you'll get an email with the recording. That's all the admin. If you have any questions, or just want to say hello, feel free to use the chat window on the right-hand side. Okay, with that I think we'll kick things off. Dan, over to you. Maybe start with a little about yourself and Too Good To Go, for anyone who isn't familiar with you.

Sure. Let's start with a little about Too Good To Go, which we abbreviate as TGTG inside the company. Too Good To Go is basically a marketplace that sells food that was destined for the bin. Think about your local bakery: at the end of the day they have a ton of items left over, which in the past they would just throw away. We provide a marketplace, an app on iOS and Android, where people can purchase those items that are left over at the end of the day. We currently have over a hundred and fifty thousand active stores on a monthly basis, over twenty million active users, and we're active in nineteen countries, having just added two very recently. I'd encourage everyone to download it if you haven't already.

It's very good. I should have said that as well.

It's a great application, a great way to save a bit of money, and a great way to prevent food waste. As you can see, this is something we're very excited about in the background; in case you can't read it, it says "fight food waste." A little bit about me: I have a rather unusual background for product analytics. I actually started my career as a trader, based out of Japan and Singapore, where I was managing trading desks covering Asia and the Middle East. Eventually I got fed up with doing something I didn't see as having any meaning or value, and briefly started up a tech company working in legal tech. I had the same feeling there, that I wasn't doing something with any real meaning or impact, and then saw this great opportunity to work at Too Good To Go in product analytics, also helping to shape the strategy of the company. I've been there for the last two years. I have a small team of five plus one embedded product analysts; one of those analysts currently sits under one of our subsidiaries, a company we recently took over. The embedded product analysts handle different stages of our acquisition pipeline, from acquiring stores, to activating stores, to optimizing the stores already on our platform. We then have another couple of teams handling key accounts, building new features, and dealing with problems in our current setup or requests that come in from key accounts. I handle everything partner-related, so I don't look at anything on the consumer app, only the store app, and I look at our partner acquisition and retention. I was asked to briefly walk you through the stack that we have at the moment.
I think everyone loves seeing what everyone else is doing in the market, so here's a simple view from the data analytics team's perspective. We work primarily in Redshift, on top of which we have dbt models that clean the data, build metrics, and so on. Redshift ingests data from a lot of different tools, but primarily Salesforce, our production database, and Amplitude, which is our event tracking system, plus a bunch of other sources. That data is then used to populate charts in Looker, and of course we also crunch it in Count. I've highlighted in red the three output layers, if you like: Looker, Count, and Amplitude are the main places where we share information with our team and the broader organization. How do we distinguish between Looker and Count? For us, Looker is for robust charts that we want to share with the wider organization. There are more controls in place to ensure only the right people see a chart, and it's a lot easier to ensure things aren't messed up at a large scale when you're working with Looker. Count is more for exploratory data analysis, sharing insights with a smaller team, and ideating within the analytics team.

Nice. I think that's really good context, because data teams are structured very differently with different stacks, and that does shape the problems and the solutions you come up with. One thing I wanted to ask before we get into the heart of this question of data team value: I've been hearing about it a lot, and everyone has their own story of why it matters to them. What was it that made you start thinking about this topic of data team value?

With my background as a trader, I had a very easy KPI to understand: make money. When I transitioned into analytics, I saw there was often a lack of clarity in teams about what people should be working on, what should be prioritized, and how to assess whether something is important. For me, the simple solution is to think about what generates value. Value can of course be interpreted in many different ways, but generally speaking you can come to some idea of how much value a task generates. So the problem I looked to solve when joining, this transition from the status quo to a company where value drives analytics, was that product analysts typically had a long list of tasks to work on. PMs would basically say, I need to see this, I need to do this, I need to measure this. The analysts would go away, conduct their analysis, and produce some actionable insights if you were lucky, and that was basically it. Analysts in that kind of setup can get quite bored and lack motivation, and honestly, you're not getting the most value from them. So the transition I wanted to make was from analysts being task monkeys, if you like, to being value-obsessed strategists.
This is not an easy transition to make, but the idea is that we get the analysts to think in a different way, to connect with their team in a different way, and to start connecting with parties outside their team as well. As I mentioned earlier, my analysts are embedded analysts, and this can be a big problem in a lot of organizations. The challenge, of course, is how you actually make this transition from task monkeys to value-obsessed strategists. I'm handling this in four ways; it's never a task you finish, but we've reaped some rewards from the transition so far.

The first point, and the easiest transition to make, is to redefine the relationship between analysts and PMs as a partnership, not a boss-and-direct-report structure. In a lot of places the product manager is the boss and the analyst works for them, which means the analyst may not feel empowered to question what the product manager is doing. They won't get out of their comfort zone, they won't work on projects that could create a lot of value within the team, and you also won't get their input on where they see strategic value. Analysts generally do have opinions on a lot of things, and ideally they shouldn't need to be asked; they should be empowered to raise these questions themselves. This transition isn't straightforward, but I handled it by first having the conversation with the analysts about the transition we wanted, and then liaising with the product team. I talked to the vice president of product and said, look, this is my vision, this is what I'd like to empower. Of course, he was very excited about the idea. From his perspective he wants great ideas, ideas that create value, and he wants insights from the data team, so he could see the value of transitioning to this partnership structure and was instantly on board. Talking to the product managers, honestly, they were also quite excited about the idea of someone else generating more input and thinking outside the box a bit more. Even if the analyst has to put some of their assigned tasks on hold, if they're coming back with great ideas, or even pushing back on certain projects and saying, look, I don't see this creating a lot of value, then the product manager also builds more confidence in what they're doing and can prioritize better. So it's really a win-win across the organization.

You mentioned you didn't have very much pushback on this, but was anyone less enthused about the idea, or was it pretty universally exciting for everyone?

For the most part, I'd say eighty percent of the PMs are very much on board; the other twenty percent, not so much. It's always a bit more challenging if your analyst is not quite as strong. If you have a strong analyst, PMs are often happy to see them as a peer, but if the analyst is weaker, it's much easier to slip into that hierarchical mentality.
And honestly, that's something I'm still struggling with a little. What you need to do, I think, is work on upskilling the analyst so that they feel empowered to work alongside the PM rather than for the PM.

There's a quick question here from Jessica, asking whether there are any cases where people say they're on board, but their actions prove otherwise.

Yes, in a way, although I'd say it's more of a problem on the analyst side than the product manager side. The analyst likes the idea, but they struggle to manage their task load and put it on pause, and that has also been quite a challenge. We've tried several things to handle it. One easy thing is to set aside time every week for analysts to do their own thing, to work outside the scope of their team. The other thing I did, which works to a certain extent, was to put a framework in place, or rather train the analysts to use a framework, that lets them field tasks coming from the PM better. A lot of companies have a structure where the PM can just create a ticket, it goes to the analyst, and the analyst now has to handle it. The analyst doesn't get a say in whether the ticket is created, and it's hard for them to put many of these tickets on the back burner. What I encourage is that no ticket is created until the analyst has had a discussion with the PM; of course, the PM needs to buy into this. The analyst then really needs to drive that conversation, making sure first that they understand the task at hand, what the PM is really looking for rather than just what they're specifically asking for. Then the analyst is much better able to assess the value of working on that particular ticket. They discuss prioritization with the PM, and only when they deem it something that should be worked on in the short term does a ticket actually get created. This helps reduce the workload on analysts and frees up time for them to work on more valuable items. I don't know if that answers your question to a certain extent, Jessica.

There's also something on the slide, in the one-to-one section, that you mentioned when we were talking before: for an analyst who's new to this, some different ways you help develop this way of thinking in a one-on-one environment. Can you talk through some of those?

Sorry, can you just repeat that?

Of course. As part of getting the analyst comfortable with this new way of thinking and challenging, you mentioned a few tactics you use in one-on-ones with the analyst to encourage it. Can you talk through some of those?

So this is the productive one-on-ones point that we have here on the slides. In a lot of cases the analysts, like the ones you describe, Jessica, are on board but don't know how to be on board. In some cases the analyst is just a bit introverted, and they struggle with challenging a PM.
They struggle with putting their ideas forward. So what I've also done is create space in one-to-ones for them to share the ideas they have. I keep asking them: what are you thinking about this week? What ideas have you come up with? What interesting insights have you found? And we can have a bit of a discussion on that topic. Ideally, from that discussion you can empower them, and they can take it back to the team. If even that isn't working, you can of course bring up these ideas yourself with the PM, and you can keep encouraging them to share and wait until they feel they have the confidence to do it. Maybe they just want a bigger idea, or they want to iterate over a few more ideas before they're ready to share. But I think this is one way to handle the more introverted analysts.

Next, I'd say an easy way to implement this change is simply to have analytics team brainstorming sessions. Analysts often feel very isolated in their respective teams; they have a team, they may be great friends with everyone on it, but they're the only analyst. Creating a bit of a community is really important for empowering the analysts to think about what they can do to improve the organization and create value at scale. So we have dedicated sessions where all the analysts sit down together. They're not bound by any constraints from their respective teams or silos, and they're encouraged to think across, in my case, the whole partner side of the company about what they can do to drive value. I don't even restrict this to strictly analytics topics; I also like them to think about operational topics. Are there operational efficiencies we could help with? Have you seen an issue with the structure the operations team has implemented? Often that is the case. Often we're not getting the value we should because analysts feel limited to their own area, or they haven't even tried looking outside the box. There's generally a lot of low-hanging fruit just from looking outside your direct field, especially in an organization like Too Good To Go where analytics is a fairly new function.

Yeah, just reframing the question is the most valuable thing. Instead of asking what can we build here, you ask, as you say, what's the best way to drive value, and you get a whole different type of solution set.

Exactly. And the final point here is value obsession. Just to really drive everything home: we've made a lot of changes, and the way everyone communicates is maybe different now, but to make sure we keep moving in the right direction, you keep asking that question. What is the value generated by this? You're working on this task; what value does it generate? Why is it important? I'm never going to stop asking that question. Everyone keeps forgetting that it's not about doing interesting work, and it's not about just doing functional work; it's about driving value.

Nice.

So moving on to the second stage here, which is more about the type of thinking I've been trying to develop in the analytics team.
It's a bit nascent at the moment, but it's something I've seen other people employ with great success. It's the simple concept of always zooming out to get perspective. Analysts in particular often have a very specific personality type, which means they like to get lost in the jungle. They love looking at the data, at everything that's easily accessible at their level, but they often forget how what they're doing fits into the bigger picture. They forget to zoom out and get that bird's-eye view of what they're doing and how it connects. I'm just switching tabs for a moment here. Right, so I have several examples of this idea of zooming out. It's a simple idea but a bit abstract, so hopefully these examples fill you in on how I'm approaching it.

First, a bit of context. We were trying to solve the issue of churn, or figure out how to boost the retention of stores on the Too Good To Go platform. This is obviously a challenge every company faces. One of the really big challenges on some platforms is where you can get data to accurately predict churn and understand the reasons you're losing people, or in our case stores, from your platform. You can get those reasons from a survey, but when you do a survey you often can't get an idea of the real scale of the different reasons. How many stores have simply optimized their operations and no longer have enough surplus to supply Too Good To Go? How many stores haven't really left us, but were just on holiday and didn't use the holiday scheduling tool? So it's about getting an idea of scale and, in some cases, digging a bit deeper to understand the nuances of the different reasons stores might stop supplying on the platform.

How does zooming out fit into this? We had all this data available, and looking at it, it was clear there weren't any great ways to understand why stores were leaving us. Looking at surveys, we didn't have enough clarity on scale. So we needed to figure out a new way to get data, to fill in this data gap. Instead of just trying to figure out things we could do via the product, staying stuck at the analytics and product level, we looked outside at what the operations teams were doing. The operations team, or I should say the sales team, had frequent touchpoints with stores that had stopped supplying. Imagine you have a hundred and fifty or two hundred people dedicated to trying to make sure stores don't leave us; the data they were effectively collecting was going nowhere. We figured we could leverage that data, and that maybe with LLMs we'd be able to process what came out of those calls. There was quite a process behind this, but at the end of the day we found that the best way to get and process this information was simply to have the sales team, or the retention team I should say, write a short sentence about why a store was looking to leave us when they called them.
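To make the processing step concrete, here is a minimal sketch of the kind of pipeline Dan goes on to describe: short free-text notes from the retention team are labelled by an LLM, and stores are then counted per category. This is a sketch under assumptions, not Too Good To Go's actual setup: it assumes an OpenAI-style chat API, the model name is a placeholder, and the category list is made up apart from "competition", which Dan mentions.

```python
from collections import Counter

from openai import OpenAI  # assumption: any chat-style LLM client would do here

# Hypothetical category labels; only "competition" comes from the talk.
CATEGORIES = ["competition", "supply optimised", "seasonal closure", "pricing", "other"]

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def categorise(note: str) -> str:
    """Map one free-text churn note from the retention team to a single category."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        temperature=0,
        messages=[
            {
                "role": "system",
                "content": "You label churn notes written by a retention team. "
                           f"Reply with exactly one of: {', '.join(CATEGORIES)}.",
            },
            {"role": "user", "content": note},
        ],
    )
    label = response.choices[0].message.content.strip().lower()
    return label if label in CATEGORIES else "other"


# Hypothetical notes written after calls to stores that stopped supplying.
notes = [
    "Store says a competitor offers a lower price per bag",
    "Bakery has optimised production and rarely has surplus left over",
]

counts = Counter(categorise(note) for note in notes)
print(counts)  # e.g. Counter({'competition': 1, 'supply optimised': 1})
```

A second pass over the notes within a single category (say, competition) could reuse the same call with a prompt asking for sub-reasons, which is the drill-down Dan describes next.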
I'm not going to go into too much detail here, but the way you phrase the questions, and of course the way you have that conversation, is really important; that's something you need to handle very carefully from an analytics perspective. Without going into those details, we were basically able to take this information about stores that had stopped supplying, collected by our sales team, process it with an LLM, and get a list of categories and a count of stores leaving us for each category. Beyond that, say one of the categories was competition, we were able to go into more depth, because the team might have written that a store stopped supplying because a competitor was offering a lower price. So we can go beyond the categories, use the LLM again, cycle through all the stores leaving us because of competition, and actually find out why. We could ascertain that, okay, in these particular countries stores were leaving us for competitors because they offered a lower price. We were able to generate a lot of value from this project, and we're still generating value from it, and it was stimulated simply by the idea that we could look outside the box of product analytics, and even the UX sphere, and collaborate with other teams to fill in the missing data. This is getting easier and more interesting with LLMs, where you can start to process calls or content from calls, and it's certainly something we're going to be looking to use a lot in the future.

Yeah, I really like this example for a few reasons. One is that it requires your own knowledge, as a data leader, of how the rest of the organization works: what operations looks like, what you can influence and what you can't. And it's this awareness you touched on in the first section, that your domain is bigger than you think it is. You don't have to stay confined to reports and dashboards; you can influence the product, you can influence what a growth person is asking on the phone. Assuming your sphere of influence is larger than it appears allows you to have a much bigger impact. I sometimes hear pushback from data leaders on this idea of adding more value: I can't control product metrics, that's the product team; I can't control growth metrics, that's the growth team; I can only ask. But what you're saying is that you do have to assume some responsibility here. You can impact those things, and that's an essential first step. So I think it's a really good example of that.

Exactly. And one thing that really helped this project along was that, from the beginning, we were careful to present it not just as a value proposition for our team, but as a win for the team that's actually reaching out and talking to these stores. They actually wanted a voice. In the past they had been sharing reasons stores were leaving us, but of course we didn't get any clear data from that.
So this actually gave them an opportunity to tell us why stores are leaving, because they felt it was a wasted opportunity, and it was. But if you don't go out there, ask them, and look to facilitate it, you never cash in on that opportunity.

That's a really good point. Sharing the win is really key. Nice. Well, I'm aware we're quickly running out of time, so I know you have a few more examples, but maybe we should jump ahead to the next section.

Yeah, on trust.

Yep, that sounds good. Excited for trust.

So the topic of trust is obviously huge; there are so many books written on it, and everyone has a different set of pillars they look at, but I'm going to share the three main pillars I look at to build trust in the organization. Number one is value generation. Again, with my background in trading, I'm value obsessed and I want my team to be value obsessed, but I also think organizations appreciate this focus on value. They want to see the value you're creating, and when you're creating value you get a voice. They will begin to trust you, in fact trust you a lot, even cutting through a lot of the noise you're creating. The other two items, which are slightly cut off on the slide, are communication, and alignment and unification. Communication is the obvious piece; I'll go into more detail on the next slide, but it's a topic that's really challenging for data and analytics teams and something we're all still trying to figure out. The alignment and unification piece is maybe less obvious, and I think it's often overlooked. New heads of data or heads of IT often come in and look to bring in new tools to transform the way an organization works, but just figuring out how to optimize the tools you're already using, how to unify processes, and how to make sure you're speaking the same language can already generate a lot of value, if not more than bringing in that fancy new tool the new head is excited about. Unless it's Count, of course.

Of course.

So, short and simple, that's how we like it. On the communication piece, what I've found works, and I'd love to hear your thoughts on this, is to start communicating what you're working on super early. We often see analysts start on some deep ML project which they see as creating a lot of value, and maybe it could, but if you haven't started communicating early, and haven't looked at exactly what you want to use the analysis for, you'll often find the stakeholders aren't aligned with what you're doing, or aren't willing to spend the time to integrate your ideas with the OKRs they've set for themselves. So always get the earliest possible start, and if you get people excited at the beginning, that also helps drive your team forward, gets them more excited, and helps them deliver on time. The second point for me is keeping it in person and direct. Analysts, again, are often introverts, and I'm somewhere between introvert and extrovert, but we like to just send off a message and hope everyone reads it and everything goes okay.
But I find it critically important to have conversations, particularly early-stage ones, in person or directly with stakeholders. That includes conversations about dashboards you've created. If you create a long, complicated presentation about the dashboard, how it creates value, how it should be used, and so on, some people will read it, but most will forget what you've written. If instead you sit down with the really key, high-level stakeholders and explain, chart by chart, what you're trying to communicate and why, you'll find a lot more value can be derived from what you're sharing. You can iron out a lot of the problems and challenges of interpreting the data, or a simple misalignment of definitions within your company, which can put stakeholders off. And the final piece, which I have to include because it's one of the Too Good To Go core values, is to keep it simple. We love to overcomplicate things in the analytics team, and you can't go past this slide without talking about the importance of keeping it simple. So often I see my analysts presenting, for example, a growth model they've built, trying to get stakeholders on board, and the stakeholders pulling their hair out trying to understand what the analyst is talking about. Analysts need a lot of guidance in this respect; you need to help them understand that the most important thing about a presentation is what's taken away. To make sure people take away the right thing, you need to really drill it home. Find the key value piece you see coming from the research, start with it, elaborate on it in the middle of the presentation, and repeat it again at the end. And you can only do that if you take out a lot of the detail about how you implemented the complicated model and how you arrived at the current structure.

Yeah, I think these are really key and, again, very simple and digestible things for people to take away, but really powerful. I want to add on to the start-early idea, because it's one thing our customers are always so surprised to see have an impact: communicating early. In a lot of tools you're forced to work very waterfall; you might get a request and then have to go away and build something before you can even show anyone. So bring someone into your process and your thinking as soon as possible. Even from an efficiency perspective, people are surprised at how much more efficient they become, because you find out sooner whether you're doing the right thing. There's so much wasted work when you haven't done that early check-in. As you say, maybe you've built the ML model on the wrong set of information, and you'll have a fraction of the impact you could have had if you'd just stopped what you were doing and had a different conversation to start with. So if you're sitting here thinking that communication is difficult to quantify or difficult to translate into value, it obviously does play into trust.
But there is also this efficiency piece, which is very quantifiable. The best thing you can do is not do the wrong thing; that's the most efficient use of your time, and getting into the practice of starting as early as possible is one of the best things I've seen people do consistently.

Yeah. Even when something seems so obvious, you'll share a really simple chart and think it cannot possibly be misinterpreted, but they'll see the same chart in Looker and it looks different. Maybe the Looker chart is wrong, but they just see it and think, I don't want to go through this now. Whereas if you'd sat down with them, you could explain: okay, the way we're calculating retention here is different; we're not calculating on a calendar-month basis, we're calculating from when a store first supplied, for example. You can iron out all those reservations someone might have, and again you instantly build that connection and that trust with the team.

Yeah, definitely.

So the final, maybe slightly heavier topic is alignment and unification. There's a lot we're doing on this at Too Good To Go at the moment. The first thing is that we noticed we had a lot of tools in the company handling, for example, event tracking, communication, and a lot of other things as well. Take the example of event tracking. We had moved from Google Analytics to a tool called Piano to measure things on the website, because the website team was different from the product team; we always have these kinds of quirks in a company. It didn't make sense that everything was being measured separately, but because the two teams, the marketing team which handles the website and the product team, weren't working closely together, no effort was made to unify the process, and people just dealt with the mess of data. Again, this is where the analytics team needs to get out of its comfort zone and recognize that working with multiple systems results in data messes. It's really easy to undervalue the time lost dealing with broken data, or with a poor join between different datasets, and I can tell you this join between the Piano and Amplitude data was really poor. For the company as a whole, the acquisition funnel is super critical: this is where the company spends a lot of money, and the website is one piece of that funnel. So creating a clean funnel is obviously critically important for the marketing team, but the product team also benefits, because they're able to share the right insights and understand the impact of the changes they're making on people acquired from different sources, which can be really insightful. This is one of those cases where it's really important to see where the value is and talk with all the stakeholders. You'll find that even though it doesn't align with anyone's OKRs, they all see the value in it, and you can add it as an OKR for everyone because everyone is so happy with the value that can be created.

The second point here is something I'm working on in Count.
Unfortunately it's still in beta in Count at the moment, but we're looking at centralizing all of our definitions. Until now, we had a bunch of different sheets that all gave verbal descriptions of what a metric might mean. For someone outside of analytics, those definitions often look very clear; it looks like you've defined things perfectly well, no problem at all. But when you look under the hood at how analysts are actually calculating things, you realize that if you're calculating a metric from different datasets, those datasets might have different filters in them, and you can end up with quite different results. On the surface it looks okay, but when you drill down you're suddenly getting wildly different numbers. So I think creating definitions not just at a descriptive level but at a SQL level is really critical. We're looking to do this in the new Count framework as soon as everything's set up: democratize the definitions and get together as a team once in a while to review them. The analysts are quite excited about this work, because they realize how much easier it makes their lives. They'll be talking on the same wavelength, they'll be able to peer-review work much faster because they can see where everything comes from, and they'll know the definition of everything down to a certain level, so there's a lot less work to understand how something has been implemented and calculated and to check that everything's okay.

And the final piece here, I don't know how much time we have now, is unified process, the documentation piece. Here we're using a combination of GitBook and Count to define our experimentation process. This is one of those things that's really easy to overlook. On the surface, things like A/B tests look very simple, particularly to the very senior layers of the data organization, but anyone who's done experimentation knows just how many problems you can encounter and how many analysts miscalculate the significance of a test. Maybe they look at five different metrics and don't realize you actually have to recalibrate your sample size calculation when you're looking at more than one metric. So that's one side of it: ensuring all the analysts are aligned in how they calculate things, so we have confidence and trust in the data. The other side is making sure we have a very clean, simple presentation of results. Every experiment will be run with different alphas and different betas, and it's actually very hard for stakeholders, even when they know what they're talking about, to really understand the results of an experiment. So I think it's critical to create an aligned way of communicating the results of an experiment. You can, for example, always present the same ninety-five percent confidence interval for the experiment, and explain to stakeholders that it's not black and white. We shouldn't just say statistically significant or not statistically significant; we can say it seems to be statistically significant, or it's close to it, or it's very significant.
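As a rough illustration of that "always present the same confidence interval" idea, here is a minimal sketch. The function name, the numbers, and the Bonferroni-style alpha adjustment for the multiple-metrics issue are illustrative assumptions, not Too Good To Go's actual experimentation framework.

```python
from math import sqrt

from scipy.stats import norm


def uplift_ci(conv_c, n_c, conv_t, n_t, alpha=0.05, n_metrics=1):
    """Confidence interval for the absolute uplift in a conversion-style metric.

    A simple Bonferroni-style adjustment is applied when the experiment
    reports several metrics, as one crude answer to the multiple-metrics issue.
    """
    alpha = alpha / n_metrics
    p_c, p_t = conv_c / n_c, conv_t / n_t
    diff = p_t - p_c
    se = sqrt(p_c * (1 - p_c) / n_c + p_t * (1 - p_t) / n_t)
    z = norm.ppf(1 - alpha / 2)
    return diff - z * se, diff + z * se


# Hypothetical numbers: 10,000 stores per arm, 12% vs 14% activating.
low, high = uplift_ci(1_200, 10_000, 1_400, 10_000)
print(f"uplift somewhere between {low:+.1%} and {high:+.1%}")
# -> uplift somewhere between +1.1% and +2.9%
```

The point is less the statistics than the consistency: if every analyst reports the same kind of interval in the same way, stakeholders can read any experiment write-up the same way.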
We can almost confirm a twenty percent uplift in the experiment, for example. Sometimes by creating that bit of ambiguity you actually create a lot more value and a lot more clarity, but you need to make sure every analyst presents the same way, so that stakeholders learn to trust your way of doing things. Over time, everything should get much easier.

Alright, that's a really good example. I'm excited to see the canvases and documents that come out of that, and to share them around as well. Unfortunately, I think we are out of time, but I do want to make sure that if there are any questions from people listening, now is a good time to ask any last ones for Dan. I know there were a few other examples you had prepped, Dan, which are really great ones; we'll have to think of a way to make sure you can still share those stories, because they're very relevant and people would like to see them. But while we wait for questions to come through, I have one. Everything you've discussed touches a lot of different things: understanding business strategy, getting involved and being aware of what other people do, being very value-centric, helping develop your analysts and their skill set, all the way down to thinking about metrics and consistency of presentation. For someone thinking ahead to next year and their data strategy, where do you think they should start, or how should they choose where to start? Because it can feel like it touches a little bit of everything. Any advice for someone thinking through that?

It's really hard to pick one thing, but I'd say the key is maybe where we started off, so jumping back to the earlier slides. Making this transition to value obsession, making sure your analysts are partnering with the PMs, really transforming the mindset of the analysts. There's only so much you can do on your own, but if your analysts feel empowered, if they start transitioning towards being more like leaders in the organization, they're able to do all that heavy lifting for you. Ideally, they handle that transformation towards the right type of communication with the organization, and they're able to identify the potential increases in value from integration or unification. So I'd say that's the most important point here, and maybe the hardest thing to execute, but long term it's going to reap a lot of rewards.

Yeah, I agree with that. And part of getting to that point is that, to do this, you really need to understand what the rest of the business is doing and what their strategy is. I see some data teams not even really knowing what the high-level business goals are for next quarter. So in order to do this, you have to get clued in to what everyone else is up to, so that you can be really value obsessed and know what that value is. Dan, thank you so much. It doesn't look like there are any questions for now.
I think there will be after the fact, so in the follow-up email I'll make sure people have a place to ask you some questions, and I'll share this recording around. And if you do think of anything else, feel free to reach out to me or to Dan. Dan, thank you so much for your time. It's been great. Thanks, Dan.

Pleasure.