SPINS’ Brian Ritz on Getting Insights and Storytelling with Data

Episode 13 | July 22, 2022 | 00:30:34
The Georgian Impact Podcast | AI, ML & More

Hosted By

Jon Prial

Show Notes

Working with data has shifted from aggregation to generating useful insights. With that change, it has become increasingly important to build an open-minded team with effective leadership.


In this episode of the Georgian Impact Podcast, we talk to Brian Ritz, Director of Data Science Solutions at SPINS. SPINS connects wellness brands to their retail and omnichannel data and has partnered with over 300 brick-and-mortar retailers so far. SPINS brings advanced analytics to its clients, allowing for deep, fact-based decision-making.


You’ll Hear About:

 

●  How Brian has seen tech develop over the past few decades.

●  Ways to improve at getting more useful information from data.

●  Ensuring you don’t get caught up in just the tech and data science side of product development.

●  Will we reach a point where there is too much data?

●  Data moats and how they can provide a competitive advantage.

●  The need for companies to change or pivot.

●  The importance of communication when striving to be a good leader.

●  Diversity in your data and data team.


Episode Transcript

[00:00:00] Speaker A: Hi, and welcome to the Impact Podcast. I'm your host, Jon Prial. Today we're going to be talking to Brian Ritz, Director of Data Science Solutions at SPINS. SPINS connects wellness brands to their retail and omnichannel data, and they've partnered with over 300 brick-and-mortar retailers so far. SPINS brings advanced analytics to their clients, allowing for deep, fact-based decision-making. As a company that's been around for the last 20 years, SPINS has worked hard to innovate amid constantly changing trends, client expectations, and new technologies. In this podcast, Brian, who's been at SPINS for the last four years, talks about how working with data has changed from aggregation to insights, building a team that stays open-minded to new solutions, and how to be a leader. Brian, welcome. You've been in the business for a while. I'd like to get a sense from you of what's changed in the past 20 years. And what I mean by that is, tech and algorithms really were a means to an end, as opposed to now, where, because of machine learning and other elements, they're a critical part of defining what the value is, what the product is going to be. How have you seen tech evolve over the past couple of decades?

[00:01:19] Speaker B: Yeah. So one thing that stands out that's changed is that what used to be called analysts are now called data scientists, I'd say. I think just the term data scientist is a newer term. I started my career at 84.51°, and I was an analyst there. 84.51° is essentially the data science arm of Kroger, one of the largest retailers here in the US.

[00:01:44] Speaker A: Grocery retailers.

[00:01:45] Speaker B: Yeah. They've got a whole company just dedicated to data analytics there. And there were no data scientists. When I joined, it was all just analysts and senior analysts. And then, I think this was maybe six or seven years ago, roughly, they changed: everyone was a data scientist. And their jobs, I don't think, necessarily changed. I wasn't at the company when they made the switch. But there's this term, data scientist, and it's an extremely broad term that ranges from the people who build AI like Siri on your phone to me, just out of college, running SQL queries on retail data, on Kroger data, essentially. And it's kind of all lumped together. I've had some good conversations with some folks at SPINS and externally about starting to separate some of these types of jobs apart. We haven't done this at SPINS, but I've had the thought that for what used to be that analyst role, I almost think the term insights engineer is a little more apt than just data scientist, because what you're really doing is storytelling: you're coming up with an insight and backing up that insight via data. Whereas machine learning is a little more of a technique for prediction, essentially. It's a little more closely tied with finding patterns in data and exploiting those patterns to make predictions on new data. I haven't really seen the term insights engineer used, and on our team we don't quite have a role that I would call an insights engineer, but I always thought it was a good term for that type of role, what used to be the analyst.

[00:03:34] Speaker A: You might have had an analyst go look at the data and figure something out, but you sort of had to have an idea of what you were looking for: show me the pattern of a certain thing, tell me something about a certain set of customers.
So let's talk a little bit more about how to get better at getting useful insights from data. I've been going to grocery stores for decades with a loyalty card, and I don't feel like the data that is being collected does anything for me. I mean, I'll buy four boxes of pasta at the checkout, and I get a coupon to buy four more boxes of the same pasta. I mean, tomato sauce, anyone? And data does affect, and should affect, merchandising too. We all know that urban legend from the past about putting the beer close to the diapers for the dad who might be in there buying diapers. We do have this data, though. So how can you help on coupons and merchandising, et cetera?

[00:04:24] Speaker B: Yeah, absolutely, provided the data is there. Right now we're working on recommendations for what to put on the shelf. What should retailers carry that they're not currently carrying is a big question that we have, because SPINS has the largest product database, in terms of attributes and products, of any of the data providers, essentially. And so it's a big competitive advantage. We can bring all that product intelligence to bear on problems like: what should I put on the shelf, what should I promote, what trending attributes are out there that I'm not currently carrying, that I'm missing out on and could expand my category?

[00:05:04] Speaker A: We spend a lot of time thinking about privacy and what I want to share, and I would love for my grocery store to say, we're not targeting you; you tell us what characteristics of you as a purchaser you're interested in. You would say, I'm interested in bulk, I need bulk purchases, so just show me bulk. Or I'm interested in boutique, or I'm interested in a certain type of ethnic food, or certain sugar-free drinks, if I put drinks up just by default. I'm almost thinking of the days of faceted search. We clearly don't want to go back to those days, but if there were a way that I could do a faceted way of identifying myself, and I was able to keep it to myself, well, we can think about Web3 versus not, but I would keep it to myself: here's what I would love to share about me, what can you tell me I'd like to buy? We might find an interesting give and take between the centralized data, which you must have, and then the decentralized attributes, perhaps, of a purchaser.

[00:06:01] Speaker B: Yeah, that's a good approach. I like that for a couple of reasons. One, the privacy that you mentioned is protected there, and two, it's a much easier problem on the data end if you're doing it that way.

[00:06:18] Speaker A: It's simpler.

[00:06:20] Speaker B: Even just a straight filter in that case would get you what you want. You can do more complex things there too, data-wise, certainly correlating across different users that all say the same thing about themselves. It's kind of what we were talking about earlier with the beer and diapers, only you can cross-correlate customers in that same manner. But a lot of times with data, the 20% gets you the 80%, essentially.
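As an aside, to make that "straight filter" point concrete, here is a minimal sketch in Python, using a hypothetical product catalog and made-up attribute names rather than SPINS data or code, of recommending products purely by filtering on the attributes a shopper chooses to share, the kind of simple 80/20 solution Brian describes before any machine learning is involved.

```python
# Minimal sketch of the "straight filter" approach: the shopper declares the
# attributes they care about, and we simply filter the product catalog on
# those attributes -- no ML required. Column names here are hypothetical.
import pandas as pd

catalog = pd.DataFrame([
    {"product": "Rolled Oats 5 lb", "bulk": True,  "organic": True,  "sugar_free": True},
    {"product": "Cola 12-pack",     "bulk": True,  "organic": False, "sugar_free": False},
    {"product": "Sparkling Water",  "bulk": False, "organic": False, "sugar_free": True},
])

# Attributes the shopper chose to share about themselves.
shopper_prefs = {"bulk": True, "sugar_free": True}

def recommend(catalog: pd.DataFrame, prefs: dict) -> pd.DataFrame:
    """Return products matching every attribute the shopper declared."""
    mask = pd.Series(True, index=catalog.index)
    for attr, wanted in prefs.items():
        mask &= catalog[attr] == wanted
    return catalog[mask]

print(recommend(catalog, shopper_prefs)[["product"]])
```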
[00:06:51] Speaker A: That would be hugely valuable to me if I did the faceted search and said, this is the type of thing, in other words, bulk purchases. If it turns out that one's a better price performer, or whatever, and you find that hundreds of other people in your selected category are interested, you can clearly offer that. There is a linkage between the data and some of the attributes of the purchasers. So, I'm a product guy at heart, and I'd love to dive more into the product side of development. How do you ensure you're not getting caught up in just the tech and the data science aspect of it, so that you don't end up building something that really isn't helpful to customers?

[00:07:29] Speaker B: It's funny, because people have told me I kind of do a lot of product-like things, in a way, as the manager of the data science team, essentially, because a lot of times you've got to really diagnose the problem you're trying to solve. It's really easy to just sprint past a problem with something that doesn't address the underlying customer needs. Even just identifying the right customer is kind of the first step there, too. And then a lot of times it doesn't necessarily necessitate data science or machine learning per se. Again, data science is a wide term, but a lot of times, if you really understand the problem, like you were talking about earlier, the faceted search, or tell me if you like organic or not and just filter on it, is the right solution technologically and from a product perspective as well.

[00:08:23] Speaker A: Like you say. Do you ever see an issue, then, because you talk about how much data you've got: is there ever going to be too much, or can the systems handle it, or do you need to add more modeling to it? Because obviously you're collecting more and more data, and maybe even going to collect more data as it relates to users and things. So is that a good thing or a bad thing in your case? Talk to me about your view of just the raw collection of product data, because you're a product person, right?

[00:08:53] Speaker B: So it's a trade-off. It comes down to a lot of trade-offs. I mean, storing data has a cost. It does have a cost, and a lot of times people tend to gloss over that. But the bills do add up, even just for storing that data, especially across time. You're slowly bleeding pennies by the gigabyte there. At the same time, you never know what you might need in the future. So it comes down to really knowing a roadmap, a path forward of where you want to go and what you want to measure, I would say. Really having a vision for your product and taking your best shot at what the relevant data will be. Because, yeah, that data has a cost in terms of storing it on Google Cloud, it has a cost in terms of the developers who've got to set up this logging thing and maintain it, and the stuff just adds up. He or she could be developing features or a whole new product or something like that instead.

[00:09:56] Speaker A: You need a vision of version two and version n. Yeah. Have your roadmap, right. I think the key to good data collection, then, would be a good roadmap. Fair?

[00:10:07] Speaker B: Yeah. Knowing the roadmap, but also knowing the potential forks in that road, to stretch the analogy, and what data you'll need to take those forks. I mean, no one's omniscient. It's a judgment call. That's why product is hard. You're defining your problem. You're defining where you might need to pivot, or what even the potential avenues might be, and what you would need to make that decision. That's where you need to be. Yeah.
[00:10:35] Speaker A: Now, as a business that's obviously always trying to maintain a competitive edge, talk to me about one of the most interesting terms that came to be in the past few years: data moats.

[00:10:44] Speaker B: Yeah. Data businesses are some of the most defensible businesses, I would say one of the most defensible business models. Nothing's ironclad, but a data business would be essentially a business that is built around a proprietary data set of some sort, whether that's proprietary because you're the only one that has the technology to acquire it, or you're the only one with a unique offering to the source of that data, let's say, or network effects. There are different ways to acquire it. But businesses with data moats, I would say it's a really good spot to be in as a business, because it's called a moat for a reason. The hurdle to overcome it is so high. As a competitor, it would require so much capital and investment to dislodge that, because changing technology is hard. You've got to figure there's so much dependency built into the technological systems of the data business's customers that it's really hard to find people that want to compete against that. So it's a really defensible business.

[00:11:51] Speaker A: Yeah. I don't know if it's true or not, I don't know if it's another urban legend, so set aside that funny rat hole, but I think it's going to be hard to compete with Tesla on self-driving, even though they're not necessarily good at it yet, because they've been collecting data on every car on the road. The cameras are running, they're collecting all the data, and they're sending it up all the time. Other than Nvidia, which I know has been collecting data for a long time, but I don't know how many Nvidia cars there are, whether they're all in the labs or people are using them. But clearly Tesla's got amazing data that may give them a competitive advantage. It's like a second-order competitive advantage: they can now deliver maybe better self-driving, potentially. Yeah, that's another level.

[00:12:38] Speaker B: I'm seeing it now. I didn't necessarily see it before, but when you talk about machine learning and AI, you need a lot of data for that. And so the only companies, I should say, that have the ability to generate those machine learning algorithms to a certain standard would be the people with the data. And that, like you said, second order keeps going, keeps that competitive advantage, where you not only don't have the data, but then you don't have the machine learning algorithms or the AI that was built on it. And that AI is in turn making the product way better, in a virtuous cycle, in a way.

[00:13:18] Speaker A: So we've been talking a little bit about product. Let's stay down the product path just a little bit. You've been in the industry over a decade. Talk to me about companies having to pivot, making changes along the way, for whatever reason, usually financial or technological. How do you view the fact that companies do need to pivot? Startups need to pivot, and even existing companies need to pivot, or get potentially disrupted by somebody else. What's your sense of that, the need to change?

[00:13:45] Speaker B: No one's assumptions are going to be perfect. I do think we somewhat overplay, in our culture, the agnosticism that one must take with product. I think it's a little overplayed that we must learn everything.
I think, especially if you have a decade, two decades, three decades of experience, you can bring a lot to the table that you've already learned, but no one's perfect. And yes, a lot of times, hopefully it's not a big pivot, ideally it's not a huge pivot, but yes, you need to pivot. And that's where having a sound technological base, a sound development lifecycle, and sound development practices really pay their dividends. When you have to pivot, it's nice to be agile, to use an overloaded term, when things are going great, but you must have that agility when you have to pivot, because otherwise you're not going to be able to do it and your business is not going to survive.

[00:14:49] Speaker A: But to get that solution right sometimes, maybe there's an arrogance of history at the same time, versus the ability to disrupt it. I'm not sure that there's a right or wrong answer, but sometimes you go in to do a solution and you think you know what the right solution is. You're not open-minded enough. There's a little confirmation bias in there. Talk to me a little about how you make sure your teams are really thinking, maybe I'll use the phrase looking left and right, as they think about where to take a product.

[00:15:15] Speaker B: I think I've been pushing the team to have some more of that confidence in their opinions about what the data should look like and what their best methodology would put forth, in a way. Yes, I understand the point. Confirmation bias is there, and it is really easy. It might just be my personality that I don't have that tendency as much as the tendency toward doubt and skepticism. It's probably more individual: you have to know yourself. Are you more prone to overconfidence and blindness and bias in your own assumptions, or are you more inhibited in your ultimate efficacy due to an overly skeptical, analytic outlook that doesn't allow you to ultimately deliver? At some point, every business is a bet. It's a bet on some assumption. You have to check that assumption, and you have to have the right assumption. And it's not only, by the way, the assumption that customers want my product; there are assumptions like I can deliver the product, I can market the product, I can explain the message about my product to the right customers. There's a whole wide breadth of assumptions, totally aside from the actual thing you're selling, and at least in parts of my career I think I've somewhat overlooked that. But ultimately, you're making a bet that this will work. And those assumptions, you do everything you can to make them valid, but you don't have infinite time. You've got to roll the dice. There's a debate there: how much is luck, how much is skill? I come down a lot more on the skill side, just because people who are good at things are really good at things, like the elite of certain fields, whether that's Major League Baseball, or the South Bend Symphony Orchestra here, or our CEO, Tony. They are good at what they do, and they're legitimately there for a reason. A lot of times in our culture there's a skepticism of expertise; in a way, that's the narrative, and I think that's fair to say. But within the domain that those people are experts in, they are legitimately there and good for a reason.

[00:17:44] Speaker A: I want to reiterate: you talked about assumptions, and there were two elements that I think were really great.
You're right that there are bets, and the bets are built upon assumptions, and they might be right or wrong. The key is to check your assumptions, not all the time, get to work, do your job, but periodically step back and check them. You also said an amazing thing, which is, by the way, that the job of a CEO is incredibly difficult, because across the breadth of an organization there are a lot of things going on, and there are clearly assumptions going on in each of those organizations, and hopefully they fit together really well too. I hadn't really thought about it that way: not check your assumptions once, out the door, but check your assumptions all the time.

[00:18:27] Speaker B: Yeah, definitely. I mean, you're always gaining feedback. You're always trying to look for the explanations for things you see. Like, I saw sales on Amazon decline 10%. What would explain that? What went on there? Maybe you ran out of stock. Maybe there's nothing wrong with your product listing, or maybe there is something wrong with your product listing. Maybe it's the customers that were there. Even just choosing when to dive in deep and what things to let go of, it all comes down to prudential judgment, essentially, of where you want to take your business and what's important at the moment. No one's perfect either, but you've got to have a feel for what's important in the moment.

[00:19:05] Speaker A: I think that's a great definition of what leadership is, no matter whether you're leading a marketing team or a dev team. That's really the key to being a good leader: to recognize that that needs to be done, and when to panic or not panic, and when to ask the questions. I think it's a great discussion.

[00:19:23] Speaker B: Yeah, leadership is an interesting topic. I don't want this to come across as a culture critique, but some of it, where I think our culture misses the mark on leadership, is that it presents, in some ways, an overly technical view of management and leadership. You can tell me this is a caricature, and we could have that conversation, but the caricature I'm drawing is that the manager is the person who has the spreadsheets and makes sure the timesheets are filled in and executes this agile methodology at this company and makes sure that everything's in order, and just kind of executes technically, in a way. Whereas leadership is much broader than that. It comes down to character. It comes down to caring for your people holistically and really serving as a locus point of direction and vision, in a way.

[00:20:33] Speaker A: So the way I've always looked at organizational structure: what you described, when you talk about the technical side, that was a manager. So there's a manager.

[00:20:42] Speaker B: Yeah, that's okay.

[00:20:43] Speaker A: And beyond a manager, you need a leader. If the leader says, everybody walk out and take a left turn, that's a force of will and a personality and a skill set. And I want to come back and talk about communication. So leadership is an element, and then the third one is vision. You could be all three in one person, but I think that would be very rare. You need to acknowledge that you need those three piece parts, and that will get you there. So I do want to come back to you and talk a little bit about communications, because it's much more than just management.
And the only way to be a good leader and have vision is to be able to communicate that. What are the challenges that you see managers faced with in terms of communications?

[00:21:26] Speaker B: Yeah, this is always a tough one for managers and leaders. I've heard the advice, and I've seen it in practice: repeat yourself, repeat yourself a whole bunch of times. Because, especially at a company the size of SPINS, or even smaller companies, you're always going to have new people. There are always going to be people hearing the same thing for the first time. And I've seen it from Tony, our CEO, Tony Olsen. He's always iterating his vision, and it's really good. I've always thought, man, he's got to get tired of saying that, but he doesn't. I can tell he doesn't, because he just wants it so badly. So there's pure repetition, and there's clarity in what you're doing. And this is, again, a tough one. I've sometimes thought, on a technology level, when you're writing software, the clearest explanation of what you want is the actual code that you're trying to write. That's actually the specification. When you're coding, you're kind of just creating specifications, encoded, of what you want. As a manager or leader, you're not going to go down to that depth, but you want to make it clear enough. Technology is one place where I think going down, sometimes even to the level of pseudocode or the things that you want to see, is not that bad of a thing. Maybe in other areas it might be a little too much or a little too in the weeds for people. Clarity is a tough one, because words, I think, mean different things in different contexts to different folks. And you're always searching for the best way to address the person you're trying to address, on their terms, while at the same time creating a shared understanding between multiple people, which is a challenge. That's the challenge, actually. Yeah.

[00:23:21] Speaker A: I've always said there are two types of presenters. There are presenters that are somewhat hit or miss because they say it the same way every time, and when they say it right and they hit the right audience, it's spectacular. And this goes all the way up to very senior execs I've seen over the years. Or there's the presenter that actually figures out the audience, either in advance or in real time, by looking at what's happening, and then tailors the message for that audience to get much better acceptance of it. And I think there's a blend. Of course, there's never one thing. It's a good skill; it's a blend of how that works. Right?

[00:23:54] Speaker B: Yeah. I mean, just that skill that you mentioned, the ability to communicate to different folks in their own language and mean the same thing, or mean just enough. This is kind of how I imagine real estate developers. You've got to bring five different parties together: you've got the government, you've got the money, you've got the construction. You've got to bring all these people together. You're always talking their language, and you're kind of creating the same story: this building is going to go up, and it's going to cost around this much. Sometimes you're shaving edges off here and there in the different tellings, but you're trying to triangulate to a shared understanding between all these different folks involved in the deal, talking in their own terms. And just that ability alone, that's a career. That's amazing.
If you can nail that down. Yeah.

[00:24:41] Speaker A: What are your thoughts, then? You've got to deliver to the right audience, but you're also asking them to be diverse in their thinking. One of the flaws in data science, of course, is that maybe you don't have enough data, or you don't have enough diversity in how you're thinking about the solution you're going to deliver. Again, we talked early on about the breadth of what you offer. So talk about diversity of data, diversity of a data team, and what that means, because there's so much to get right, and there's so much risk if you get a massive amount of data but it's not diverse enough.

[00:25:14] Speaker B: Yeah. I think with communication like this, to different parties and talking on their terms, it always helps to cut through the noise of all the abstraction of language as quickly as possible and get to the concrete, in this case, what shows up on your screen. I always kind of use that term. What do I expect to see on my screen when X is finished, or after 60 days? What are we going to see on my screen that I can't see right now? Where am I going to go, where am I going to log in to see it, what am I going to enter, and what am I going to see? It helps, I feel like, to take that concrete perspective, because then you're actually thinking in terms of expectations of what you want to see, and you're thinking in very concrete terms of what's actually possible. And you can start conversations like that. Even better than just conversing about it is actually showing something concrete. That's where the MVP story really gains traction. Sure, showing mockups is good. When we're developing products, I've been pushing the team recently to deemphasize model accuracy or algorithm accuracy up front, to build the full product and show something concrete, with the knowledge that there's a pretty linear path towards improving that prediction, let's just say in terms of accuracy, but putting that prediction in the context of a product. By the way, this has the advantage of aligning all of engineering, product, and data science at the beginning of a project, because you don't want to waste a lot of cycles just optimizing the accuracy of an algorithm and then go integrate it. That's another handoff. Focus kind of gets lost over time between the teams, and you've got to come back and then regroup and integrate. We've had some good success with that. I feel like it's been nice, really deemphasizing the model accuracy up front, getting a full end product that does the job, knowing we can do better. And then you have a really nice, neat interface, as the data scientist, to iterate on your model, and you know exactly where that file is going to go, or where that model is going to go, and it gets picked up by the system.
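To picture the kind of "neat interface" Brian describes, here is a loose sketch in Python, with hypothetical names and data rather than SPINS' actual code: the product integrates against a small recommender interface from day one, a simple baseline ships behind it so something concrete shows up on the screen, and the data science team can later swap in a more accurate model without the rest of the system changing.

```python
# Hypothetical sketch of "build the full product first, improve the model later".
# The product code depends only on the Recommender protocol; a trained model can
# replace SimpleBaseline later without any other changes.
from typing import Protocol

class Recommender(Protocol):
    def recommend(self, store_id: str, k: int = 10) -> list[str]:
        """Return up to k product IDs to add to this store's shelf."""
        ...

class SimpleBaseline:
    """Day-one stand-in: suggest the most widely carried products this store lacks."""
    def __init__(self, popularity: dict[str, int], assortment: dict[str, set]):
        self.popularity = popularity   # product_id -> number of stores carrying it
        self.assortment = assortment   # store_id -> product_ids already on the shelf

    def recommend(self, store_id: str, k: int = 10) -> list[str]:
        carried = self.assortment.get(store_id, set())
        candidates = [p for p in self.popularity if p not in carried]
        return sorted(candidates, key=self.popularity.get, reverse=True)[:k]

def render_shelf_report(model: Recommender, store_id: str) -> str:
    """The 'what shows up on your screen' piece, independent of model accuracy."""
    recs = model.recommend(store_id, k=5)
    return f"Suggested additions for {store_id}: " + ", ".join(recs)

if __name__ == "__main__":
    model = SimpleBaseline(
        popularity={"oat_milk": 900, "kombucha": 750, "protein_bar": 600},
        assortment={"store_42": {"protein_bar"}},
    )
    print(render_shelf_report(model, "store_42"))
```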
[00:27:28] Speaker A: Just to drill in to make sure I understand it, because there are two ways I could see this going. So I get it: don't go deep, deep, deep and spend forever on the model, because that's just a piece of the solution.

[00:27:37] Speaker B: It's all...

[00:27:38] Speaker A: Got it. So you need to have all the scaffolding up, start to finish. But would you say what I want to see on my screen after 60 days is an analysis of the snack market, and then later you could show me the juice? Or is it the breadth of the store? When you talk about show me, how do you think about that?

[00:27:58] Speaker B: Yeah, that's a great question. It's probably context dependent on what you're trying to see. There's a risk in just segmenting out. You always want to take out a subset of the problem, but it's like, which subset do you want to take out? Just one category, as you say? I totally get what you're saying. It's hard to even describe where you're going.

[00:28:19] Speaker A: Yeah, I don't think I want to do just the snack market, because I need the snacks next to the drinks. Otherwise I'm missing that, right? Come on.

[00:28:26] Speaker B: That's the risk. The risk is you kind of overfit to snacks. I had a conversation about this today: you come up with a solution that's really good for snacks, and you go to sodas and it's just not working, just because it's a completely different category. So to have some of that generality, it's context dependent. But yeah, you want to think along that generality dimension. Don't overfit is...

[00:28:53] Speaker A: ...a great takeaway, though. Don't overfit. Recognize that with a solution such as yours, so broad across what a store has to offer, with the amount of data you have, you've got to be careful. Obviously, everything is a balance. You like to talk about communications and how people get a message. How do you communicate what a lion is to someone that doesn't know what a lion is? Can you just talk about that?

[00:29:15] Speaker B: Thinking about people in the Middle Ages in Europe, they didn't know a lot of what we know today, probably including, they probably never saw a lion. They probably never traveled to Africa, and they didn't have TV, and they didn't really know that much either. So it's really easy to make assumptions. I guess the takeaway is that historically, whether that's the Middle Ages or whether that's your company last year, it's really easy to make assumptions about what they knew, or their competence, in a way. And those assumptions, you know, it's really tough to put yourself in those shoes. Those assumptions aren't always accurate. And it's good to always go through the exercise and really try to put yourself in their shoes, I suppose, and see. But there's just so much. It's just such a different experience.

[00:30:10] Speaker A: Thanks for joining us, Brian. It was great that we started chatting about product development and ended up in the Middle Ages. I really appreciate you giving us the time.

Other Episodes

Episode 12

October 22, 2021 00:25:35

Navigating the Cybersecurity Landscape with CISO Alex Manea

Cybersecurity is a topic that is not to be taken lightly. We're working with Georgian's Head of Security and Privacy, Alex Manea, to bring...


Episode 46

November 25, 2019 00:20:53

Episode 46: Security and Messaging in Asia

It's no secret that Asia is leading the world in messaging with platforms like WeChat and Line. Yet when it comes to security, it's...


Episode 1

January 20, 2022 00:29:45

An Introduction to Self-Sovereign Identity with Northern Block CEO Mathieu Glaude

Shouldn’t you own and control your identity? It is yours after all. That's the idea behind self-sovereign identity — the idea that you control...
