Increasing adoption of data products, by design. w/ Brian T. O'Neill

This is a podcast episode titled, Increasing adoption of data products, by design. w/ Brian T. O'Neill. The summary for this episode is:

Increasing adoption of data products, by design

Look no further than Brian T. O'Neill's bio to tell you that's what he does best.

Our special guest this week knows that low adoption of data products is enterprises' biggest enemy. Not just in terms of quantity, but also quality of these investments. Why are teams so often creating technically right, effectively wrong data products? Why do people fail to adopt when it's the most crucial part of becoming data-driven organizations?

These burning questions have answers.

Join hosts Juan, Tim, and guest Brian T. O'Neill on this week's episode of Catalog & Cocktails.

Key takeaways:
- [00:06 - 03:30] Intro & Cheers
- [03:33 - 04:54] What is your instrument of choice and what would your band name be?
- [04:59 - 07:55] Honest no BS definition of a data product
- [07:59 - 11:17] Alternative definitions of data as products
- [11:18 - 14:11] Data products can be many things, and definitions are broad
- [14:11 - 19:39] How human-centered design interplays with defining data products
- [19:47 - 25:17] Data therapists, knowledge scientists and engineers
- [25:21 - 33:48] Where does the burden lie, and with whom
- [33:49 - 38:35] Brian's perspective on data product management versus software product management
- [38:38 - 43:59] How do you achieve good adoption?
- [44:05 - 46:39] What is the best way people can start learning about human-centered design?
- [46:42 - 47:20] Can dashboards be data products, and can machine learning be data products?
- [47:32 - 49:45] Should companies be investing in data product managers?
- [49:45 - 52:45] Value and adoption, should companies track data ROI?
- [52:58 - 58:20] Takeaways
- [58:23 - 01:02:09] Three questions

Speaker 1: This is Catalog & Cocktails, presented by data.world.

Tim Gasper: Hey, everyone, welcome to Catalog & Cocktails. It's your honest, no BS, non-salesy conversation about enterprise data management with tasty beverages in hand, presented by data.world. I'm Tim Gasper, longtime data nerd, product guy, customer guy at data.world, joined by Juan.

Juan Sequeda: Hey, I'm Juan Sequeda. I'm a principal scientist at data.world. And as always, it's a pleasure, it's Wednesday, middle of the week, end of the day, and it's time to chat about some data. And today, we have a guest who has been talking about data products before data products were even a thing people were talking about. That's Brian T. O'Neill, who's a founder and principal at Designing for Analytics. He helps data product leaders increase adoption of machine learning and analytics through human-centered design. And if you don't follow him on LinkedIn, you've been really missing out because, as I mentioned, he's been talking about data products before anybody out there, and connecting them to users, which I love. Brian, it's great to have you here. How you doing?

Tim Gasper: Welcome.

Brian T. O'Neill: What's up? Thanks for having me here.

Juan Sequeda: Awesome.

Brian T. O'Neill: And I do have my cocktail in hand.

Juan Sequeda: Ooh, look at that special... tell and toast you're rolling right in here. What are we drinking and what are we toasting for?

Brian T. O'Neill: People will have to write to me if they want to know what's in it. That's secret. But I'll tell you a little bit about the mug. This mug is actually... so I'm into tiki drinks as you can see. This is actually a mug from Crazy Al. So, if you check out tikimania.com, he is a wonderful artist and carver. He's actually in a band as well. So, they actually carve, he takes a log on stage and it's like punk slash Hapa-Haole music. And he carves the tiki with a chainsaw on the stage while they're playing and dances around with coconuts. He's wild, but he's actually a beautiful artist. So, he does these at tiki mug scale and palm tree scale. So, I thought I'd bring something a little creative on the show.

Tim Gasper: That is so cool. And for those that are going to be listening to the podcast, you're missing out. You got to check out the video as well. There's a really awesome mug that Brian's shown us right now. So, you got to check it out.

Juan Sequeda: And what are you toasting for today?

Brian T. O'Neill: I'm toasting you guys for having me on your show.

Tim Gasper: Whoa.

Brian T. O'Neill: But I want to know why is it not Catalogs & Cocktails? Why is it singular Catalog?

Tim Gasper: No, maybe it's a verb, Catalog & Cocktail.

Juan Sequeda: Actually, nobody has ever asked that question in-

Tim Gasper: But it's Cocktails. I don't know.

Juan Sequeda: Okay. A little bit of history around this stuff. Remember we started the podcast beginning of the pandemic and I wanted to do something with wine because I like to drink wine. But then, I saw somebody doing some afternoon wine thing. I'm like, " Ah, they screwed."

Tim Gasper: That idea is taken.

Juan Sequeda: I can't do the wine thing. And then, we're doing cocktails, and because we work at data.world, which is a data catalog, we said Catalog & Cocktails, and that's really the story of it.

Tim Gasper: There you go. All right.

Juan Sequeda: I don't know. Anyways, but-

Tim Gasper: Now, we get to drink every Wednesday. So, that's fun.

Juan Sequeda: We have right now the Gratefulist Vodka with Rambler, which is an Austin sparkling water. And I poured in some agave syrup. I have everything here, agave syrup. And I did some orange bitters and I just put that together. It's just a fancy, little vodka soda, not a big deal.

Tim Gasper: There you go. All right. Cheers.

Juan Sequeda: Cheers. And I'm going to announce it today, a cool thing: we announced our AI Lab here at data.world. So, I'm heading it, I'm the head of the AI Lab, and we just do really cool stuff. So, cheers to that. All right, cheers.

Tim Gasper: Cheers.

Brian T. O'Neill: Congratulations.

Juan Sequeda: Thank you. We got a warmup question. What is your instrument of choice and what would your band name be?

Brian T. O'Neill: Well, we talked about this earlier because you have a conditional, what would my band name be, and I do have a band. It's called Mr. Ho's Orchestrotica, and that's a whole other story.

Tim Gasper: Nice.

Brian T. O'Neill: Instrument of choice is tough. I'm a percussionist, that's my training. It's plural, right? It's plural by definition, but I'm a particular fan of ethnic tambourines from around the world, so I specialize in that in terms of hand percussion, with a soft spot for the pandeiro, which is the tambourine that's played in Brazil in samba music and choro music. And it's played in a bunch of different styles. So, that's probably my favorite.

Juan Sequeda: I have at home bongos. I grew up with salsa.

Brian T. O'Neill: There you go.

Juan Sequeda: And I'm very extremely amateur about it. So, that's my instrument of choice. If I would be in a band, I would be in a salsa band. I'm not being creative right now with the names. I'll probably come up with one later. How about you?

Tim Gasper: Well, I play piano, so I would probably be in maybe an '80s synth band or something like that. I don't know. I'm not very creative either, the Maniacs or something like that.

Juan Sequeda: All right. Enough warmup here. Let's go. Let's get in. Brian, honest, no BS, what's your honest no BS definition of a data product?

Brian T. O'Neill: Sure, sure. I'm going to give you the imperfect one that I recently defined on my podcast actually, and I say it's imperfect because I shipped it before it's done, because I was tired of talking about it without defining what it was. It was one of these amorphous things in my head. But I'm going to give you what I call the "producty" definition, which isn't a word, but I'm going to use that as a word. So, product with a Y at the end. And what do I mean by that? A data product is a complete end-to-end human in the loop decision support solution that's so good someone might pay to use it. And so, the important thing about this, when we talk about a product, is I think sometimes when we think about data products, we think of product as the end result of labor. It's this thing that gets emitted when labor is done, and that is absolutely not what I'm talking about. That is an output of a piece of technology. What I'm talking about from a product standpoint is it's something worth trading for, where there's an inherent value in it, and the value is only in the eyes of the people who it's for, the stakeholders, the users, the customers. It has to have some value. And the idea here is that it becomes a data product when it's so valuable, they might pay for it, or you could say they might exchange something of value for it. There's probably a fair number of people listening to this that work in data science or analytics capacities serving internal business users. They're not developing a SaaS product, they're not at a technology company selling their wares, their software. But this idea can still stick there, which is that the solution we made isn't just an output, it's not the byproduct of work. It's a product because it's so good and it has some inherent value quality to it. That's what I'm trying to push. I know that's not the definition that I think a lot of people think of here, and there are holes in it. It will not stand up to every use case that you throw at it. But the spirit there I think is largely missing in the data community.

Tim Gasper: I really like the way that you walk through that and I feel like... you chose your words very carefully here with your definition and I think they all have a lot of meaning associated with them. I even want to make sure that I captured it correctly here. So, a complete end-to-end human in the loop decision support system that is so good someone would pay to use it or exchange value for it.

Brian T. O'Neill: Correct. Might pay to use. Yeah. I say might, because I know, again, a lot of times, internally, we're talking about funny money, marketing's paying the data science team with funny money, shifted stuff around-

Tim Gasper: I need that dashboard. Well, how much is that for you, right?

Juan Sequeda: But you also said that many people would not agree with that definition, that there are other definitions. What are the other definitions out there? Alternative definitions?

Brian T. O'Neill: I can't tell you in simple words. Every time I see something, it's usually an article defining it. It hasn't been summarized down into a couple of sentences. Yeah, I think it was Tom Davenport and Randy Bean, I think they had written an article and they have a similar one. They talk about analytics products, they were drawing this distinction between data products, which may or may not have analytics components to them, and analytics products, which in their worldview always do have this analytical component. I think the distinction there is that some people think a data product is like, here's this valuable CSV file that we're selling. It's a set of data that gets sold or packaged or reused, but inherently it may not be usable in itself without some transformation or something that's in it. No human being could directly get value out of it in its current form without some manipulation to it. So, I'm not really talking about that, like packaged data, buy some email addresses for underage underwater basket weavers. That's not what I'm talking about here. I'm talking about something that probably has some IP in it, some intelligence to it. It's probably a piece of software of some kind. It may not have a user interface, a GUI to it. It could be something that only has an API endpoint to it. But developers are also users and they're also humans. And so, there's a developer experience, data scientists use tools, data scientists can also be users of things as well. So, the human in the loop thing also gets to the human-centered design piece. And when we talk about that, the distinction here is that human in the loop means, to me, that you might be talking about users of the solution that actually touch the software in some capacity. Beneficiaries of the solution, stakeholders, someone that paid for it, someone that just wants to know, is it working or not, but they don't directly touch it. Third parties who might be affected by the use of the system but are not necessarily in the room having any decisions over how it's used. This is your classic example, applying for credit at a bank and you want to get a mortgage and you are a customer. You're a human that's in the loop of that service, that system, but you don't have any control over the decisions about whether or not you get your credit approved. You're an affected third party in that case, you're also a customer in that case. Or think about a facial recognition software running off of a camera system, and you have pedestrians going by that are also getting picked up by the system and don't know that they're being tracked. Those are also humans that are in the loop. And so, the human-centered design perspective has us considering all these factors, not just the technology part of it. We're thinking about that entire ecosystem and what the benefits or the damage is that we can do with such a system.

Tim Gasper: That makes a lot of sense. I really like the way that you're articulating what a data product is because as you're saying, you're broadening that definition beyond just data by saying it's a decision support system. It could be analytics, it could be a dashboard, it could be a visualization. Multiple things could fit that bill, human in the loop, end-to-end, the value aspect of it. One thing that we think a lot about on our side is what are the characteristics of a data product? And Juan and I have been talking to a lot of folks around this idea of data product ABCs where a data product has to have accountability. It has to have some boundaries. What is it and what is it not? What are its interfaces, some constraints and expectations around it? Who are the downstream consumers? Usually if there aren't consumers of it, then is it really valuable or is it a product? And then, E stands for explicit knowledge. There's actually some actual context and explicit, not just implicit knowledge around it. Curious if any of that resonates with you and how do you think about data products and defining data products.

Brian T. O'Neill: Well, beyond the definition I gave you, it's already fairly broad. I don't think I have a set of attributes collected the way that you have in that sense. I think some of those could apply to other solutions anyways. That doesn't mean they're not helpful to have those things anyway. I also use things like an anti-goal sometimes in my projects, which is, especially when we're talking about really big complicated systems, it's like, "Well, what is this platform not doing?" Let's talk about the nots, because, and this is especially true when you get into enterprise and you're getting into platform development, which is a whole other thing. But if you fall into the trap of we're building something that theoretically can do all these things for all these people, that is often a trap, because a lot of times, what happens is it never does a few things really super well. It ends up being a B-minus thing for everybody. So, I think it's important to have some of those not things in there so that we do have some constraints around it and we can actually focus on delivering some outcomes and not just outputs. Just every two weeks, we're shipping code, so therefore, we're making progress. That's not necessarily progress. Progress in the eyes of users, stakeholders, et cetera, is am I seeing a change in my outcomes and the metrics that we said matter? Those are the things that matter. That's the game that we're supposed to be playing. So, if those attributes help, then I think, amen. It makes sense.

Juan Sequeda: For me, the important word you said is the outcomes. And I think we forget about that sometimes. It's like, what is the outcome we're expecting and are we actually getting there better? Are we getting there faster and so forth? And we really need to understand that.

Brian T. O'Neill: Sure.

Juan Sequeda: I think the other aspect is, I mean, I want to dive more into the human-centered design, because so much of the conversation is always the technology, the tech and the data. I need all these systems and all, blah blah. But we forget about the human aspect and the people. This has been your expertise. Tell us a little bit more about the human-centered... what is human-centered design and how does that come into play when coming up with and defining these data products?

Brian T. O'Neill: Sure. At its core, we're really talking about... it might be helpful for this audience to think about what it's not. It's not starting with a data set and thinking about how could we build a model on it? How could we put some value on it, how could we display it, how could we visualize it and then show it to somebody, taking guesses about what might be in there. I'm not saying there's not space for discovery within a data set, which is, I don't know what I'm looking for and I'm going in there. That's fine. The human-centered design perspective is usually about working backwards. It's about working backwards from the humans and then having the technology meet the human need where it's at, and not trying to push the technology or the solution onto somebody and try to change their behavior. So, the examples I give of this frequently are going to be like, for example, you're building a propensity model, something to help sales teams know who are my best sales targets. Who should I be calling this week? Who should my team call? And so, the human-centered design way of this is probably to understand, (a) how do you decide who to call now? Just forget all the data that we have. Forget that we could predict it now. It's understanding how do you go about doing that now, and spending time with users to understand this at a fairly deep level. The second thing is that there's probably some good inherent knowledge in that particular example, where the salespeople's internal knowledge from doing this probably will be valuable to the solution regardless. But having the users involved in the creation of the thing. So, this is the idea of, we're not designing it for the sales team, we're designing it with the sales team. So, the sales team becomes part of the team that's making it, so they're regularly involved in it. We don't go away for six months and then come back and throw something over the wall and show it to them for the first time. And you're like, "Well, it's a black box model, but just trust me, these are the 10 people that you should call, and our model's 87% accurate based on 50 years of historical sales data." They're like, "Yeah, that's nice. I've never seen these names before." And I call my warm leads every week and I'm tracking these deals and my salary's based on this, and you want me to call these 10 people? I've never seen these company names before in the CRM. Who are these people? Done. Your data science project has now ended because you didn't form a relationship with them. You don't know what it's like to be a salesperson. You don't know what their fears are. We've missed so much of the human component there. So, the design solution there would be thinking about what are their fears, what are their needs, what are their attitudes about using software? Where are their opportunities, which is like, look at this unnecessary work they're doing. They go into HubSpot and then they filter on this thing, and then they export this thing and then re-insert it back into the thing and merge it with the company data and then divide by two. And then, they come up with this list and sort by date, and then the A to F names go to Juan and the G to H names go to so and so, and then they start making calls. It's like, wow, what if we could help you get rid of all that stuff and get you on the phone more?
And you would know exactly why we came up with this list because our model's going to tell you why we picked these names for you. Yeah, that sounds really great, and you're tracking it to something that they actually care about that relates back to the way they do the work now. This is how we get them to change their behavior because they've been part of the solution, they're seeing or feeling that the better way to do it is based on things that I complained about or things that I said were really tough to do or they're hard to use or whatever the reasons might be. They can just naturally see that. It's just a better for them situation. Not a better, because the technology part works better because... you know what I'm saying? The value is in their eyes. It's not in our eyes about how good the solution is that we think that we came up with. And this is where I think teams get trapped. And so, in the problem space also, this is where you're also going to learn a ton because you might have all the data in the world to build this model that the sales team could use. But my whole thing is you can't get to the business value land if you don't go first through the user adoption land. So, the first game that data product professionals need to do is they need to play the game of user adoption first. Forget all this talk about business value, making money, saving money, saving time, all that stuff if you're going to skip over the users in the loop and the people that actually need to use it in order to get there. So, first solve the adoption problem.
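To make the "the model's going to tell you why we picked these names" idea concrete, here is a minimal, hypothetical sketch of a call list that carries the reasons along with each suggestion. The features, weights, and company names are invented for illustration; a real propensity model would learn the weights from historical data rather than hard-coding them.

```python
# Hypothetical "who should I call this week?" list that surfaces reasons, not just names.
# Features, weights, and leads are invented; a real propensity model would learn the weights.

FEATURE_WEIGHTS = {
    "visited_pricing_page": 2.0,
    "opened_last_3_emails": 1.5,
    "existing_customer_referral": 2.5,
    "days_since_last_touch": -0.05,  # staler leads score lower
}

leads = [
    {"name": "Acme Corp", "visited_pricing_page": 1, "opened_last_3_emails": 1,
     "existing_customer_referral": 0, "days_since_last_touch": 4},
    {"name": "Globex", "visited_pricing_page": 0, "opened_last_3_emails": 1,
     "existing_customer_referral": 1, "days_since_last_touch": 30},
    {"name": "Initech", "visited_pricing_page": 0, "opened_last_3_emails": 0,
     "existing_customer_referral": 0, "days_since_last_touch": 90},
]

def score(lead: dict) -> tuple[float, list[str]]:
    """Return a propensity-style score plus the positive signals behind it."""
    total, reasons = 0.0, []
    for feature, weight in FEATURE_WEIGHTS.items():
        contribution = weight * lead[feature]
        total += contribution
        if contribution > 0:
            reasons.append(feature.replace("_", " "))
    return total, reasons

for lead in sorted(leads, key=lambda l: score(l)[0], reverse=True):
    s, reasons = score(lead)
    print(f"{lead['name']:<10} score={s:5.2f}  because: {', '.join(reasons) or 'no positive signals'}")
```

The design point is less about the scoring math than about the output shape: the salesperson sees why a name is on the list in terms that map to how they already decide who to call.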

Juan Sequeda: All right. This is so much to unpack and I love how you went down through this example. That last point is something I wanted to dig more because I've been pushing a lot about show me the money and then make me money and save me money. But before we get there, everything you said, I hear this and it seems so obvious that we should be doing this. And what frustrates me is I go off and we go talk to people and we're like, " This is groundbreaking for them." It's frustrating personally to go see. It's like, " Wait, why aren't you go talking to the people? Why aren't you go talking to the people you're trying to go solve this problem for?" As you said, do you know what their fears are? What are their actual pains that they go through? You're making all these assumptions. You're not building any relationships with them. And I think part of it also is, from a team structure, we're just so focused on having people who are focused on the technology and lack those social people skills. I want to get into roles in that, but I think that's something there. When we started defining these product teams around data, we need to have those types of people who can fill those roles of what we call the data therapists. I call it the knowledge scientists and knowledge engineers who can build the bridges between the people who can talk to the end users and can translate those requirements back to the technical folks who are doing things with the data. Because what I see a lot is that the technical folks, they don't like to go talk to people. Or technologists, they don't like to talk to people, and they're like, " I just give you data. That should work." And I think that's something that we need to break in our industry. I don't know. That's my comment based on what I heard. I wonder what your thoughts about that.

Brian T. O'Neill: Yes, I understand what you're saying, although the teams that tend to perform the best when you look at... and a lot of these ideas are not fresh ideas from me. These are things that mature software teams understand, software product teams understand, the idea of having multiple people, not just the person whose job it is to do that, the product manager for example, going and talking to customers. To be totally honest with you, where I see the biggest impact is usually when engineers or the people closest to the technology development are the ones that are going out at least listening to customer conversations or user conversations or stakeholder conversations, if not actively participating. And for people that say, "I don't like to talk to customers," well, here's the great thing, and this is especially true if you're introverted. Most of your job is to shut the F up and listen. It's not to talk. Your job is actually to listen. So, it should be a 20/80 conversation where you're coming in with a list of questions and your job is to keep them talking, and you're listening for facts and information that are going to be useful to the product development process. It's not really to talk to them a lot. And there are all kinds of reasons why this doesn't happen. Cultural reasons, like, "Oh, they don't have time. We don't have time to do it," gatekeepers, it feels like we're challenging somebody else. And what I hear from senior leaders is, "I want my team asking these questions." I just talked to a chief data officer about this on my podcast. He was talking about, "I want them in the room asking these questions," when the stakeholders are saying, "I need K-means clustering so that I can go do X, Y, and Z." And it's like they've loaded in this assumed technology solution with maybe a statistics 101 background that's just slightly better than my own or slightly worse than my own, because I've never taken statistics. And so, they've loaded it up with an assumed technology. And the worst thing that can happen there is that the team goes off and says, "Great, K-means clustering, coming right up, let's do it." That is a recipe for disaster. It doesn't mean that that stakeholder is always clueless, and there might be a valid reason to use K-means clustering on this project, but the real question is, "Well, what are you going to do with that information after we do this algorithm, build this model, and then there's going to be a dashboard or something, then what? How will you know if we did a good job? What will be different in your world if we do this? How will we know that the world is a better place or the company is doing better?" We have to understand that stuff first and then we work it back. And this is where you can start to have conversations about whether K-means clustering is the right way to do that, or maybe some other solution is a better way to do it because it will take half as long, half as much data, half as much cost. It won't be quite as accurate, but we'll have put something out into the world and tested it in a third of the time, and we'll start to know if we're making some progress there instead of spending a whole bunch of time and money and maybe coming up with nothing, because we don't even know if there's anything there. I went on a couple of tangents there.
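For readers who want to picture the "K-means clustering, coming right up" request, here is a minimal, hypothetical sketch of what that clustering might look like with scikit-learn. The data, feature names, and cluster count are invented for illustration; as Brian argues, the clustering itself is the easy part compared with agreeing on what decision the segments will drive.

```python
# Hypothetical sketch of the stakeholder's K-means request.
# Data, feature meanings, and cluster count are invented for illustration.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Stand-in for customer features pulled from the warehouse (e.g., recency, frequency, spend).
X = rng.normal(size=(500, 3))

X_scaled = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X_scaled)

# A fit statistic tells us the clusters are mathematically coherent...
print("silhouette:", round(silhouette_score(X_scaled, labels), 3))
# ...but not whether anyone will change what they do on Monday because of them.
# "Then what? How will we know we did a good job?" still has to be answered
# with the stakeholder before this is worth shipping.
```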

Tim Gasper: I love what you're saying here, and just one quick follow- up question to that is there's this interplay that happens, especially between data teams, analytic teams and the broader business organization that they're trying to partner with. And I think you gave a great example here of, " Hey, I want a K- means clustering that does X, Y, and Z." And then, the data team or analytics team might be like, " Okay, well, submit your Jira ticket and we'll see where this fits into the backlog."

Brian T. O'Neill: Do you want fries? You want fries with it?

Tim Gasper: Yeah, right. You want some fries?

Brian T. O'Neill: That's a data drive-through. I'm serious. This is the data drive-through thing. Yeah.

Juan Sequeda: Oh, the data drive-through.

Brian T. O'Neill: It's just like, " What do you want loaded up?"

Juan Sequeda: I love that.

Tim Gasper: I love that data drive-through. So, who's the burden on to change things more? I mean, obviously it's on both sides, but is it on the data analytics team to be like, "No, no, no. Hold on, let me ask you a bunch of questions and things like that, let's really get to the bottom of this," or do we need to really put more of the burden back on the requester? Like, it was wrong for them to ask for the burger and fries and they should instead be saying, "I have a hunger that's set up like this." Where does the burden lie in trying to fix this?

Brian T. O'Neill: Sure. I think old me would've said, "Yeah, well, if you don't get what you want, it's because you didn't properly form the question," and all this. And I think newer Brian doesn't look at it that way. And so, the way I would like data professionals to think about it when they hear a loaded question like this, a loaded request that has an implementation, and it has to use Snowflake, by the way. And I don't know what that is, but everyone else is freaking using it, so it's got to use Snowflake. But anyhow, when they say it has to use Snowflake or it has to use K-means clustering, to me, this is a cry for help and it's actually a gift. It's this stakeholder trying to speak our language, they're trying to help us understand it in words that are our lingo, our data talk. So, instead of looking at it as a literal request, because giving people what they asked for is often not the best way to do it, because they're giving you what they think they need because everyone else is doing it, but they don't even know what Snowflake is, except all the competitors are using it. And there might be a reason all the competitors are using it, or maybe they just have a really great marketing strategy and they've convinced everyone they should be doing it. That's irrelevant. The main thing there is, I see it as a plea for help. It's an initial offering and it's the data product team's job to go and unpack what's behind that request. Great. So, you want this, why do you want that? What's behind that? How will we know if we did a good job when we get there? And it's to get back to what the unarticulated needs are. So much of the time, you have this concept of, and this is also not my concept, but a presenting problem. The presenting problem is often not what really needs to be there. I hear this all the time. The presenting problem I get as a designer on consulting work is the dashboard needs to be redesigned. We need charts, we need better, modern-looking data visualization. It needs to look like Apple would've designed it. So, great. You like Apple design, let's get behind that. What would the Apple design version be like? What would change, what would be different than there is now? And eventually, we get out of the talk about graphic design and visualization and charting types and we start to get to the real stuff. Well, our sales metrics are down and we're trying to raise an A round of funding, or we're trying to do X, Y, and Z. And right now, it's really hard for the team to tell because they spend too much time in the dashboard and they give up and they're sending me guesses. They put their crap into the decks. There's really a bunch of baloney behind it and they're not really using the data to tell them what to do because they say it takes too much time. Okay. So, it's too hard to get any insights out of the dashboard. That's what the issue is. Yes, that's the issue. Okay. We've completely stopped talking about Apple or what Apple would look like or best-in-class visualization. And that's okay that they're saying that. I'm glad that they think that, because we actually have a design problem here. This actually falls into a design thing that's actually a positive. And the same thing I think for data scientists when they hear these loaded suggestions that it has to use AI, it has to use machine learning, it has to use whatever. It's a cry for help in a way. So, let's unpack that. And whose job is it? I don't know whose job it is.
I like this idea of teams owning the problem space and not just owning the solution space. And Marty Cagan talks about this a lot. I love this idea, which is, and it requires a mature organization, it requires genuinely giving the team ownership of the problem. It also means the problems have to be well-defined, or they have to be ready to go out and surface those problems. But there should be some metrics behind it. There should be a before and after state that everyone can understand. It's not, go build a model. It's, we're trying to see if we can get a 12% lift on this marketing strategy before November when the holiday season kicks in. And so, we want to spend half our ad budget automating it and see if the system can automatically create creatives that are at least 12% better than what we're manually doing right now. So, we have some metrics, we have a definition of done, we have a definition of value there. We have something there that we can actually work with. How we do it, whether it's a machine learning model or it's just like, "Well, let's just hire four designers that are really good at ad stuff and come up with better..." like however you solve it. The idea is that the data product team owns the problem space now and they sense that ownership. I really like this idea too, and another reason why I wanted to go back to whose job it is to talk to users is that if you have these customer conversations going, your classic power trio would be user experience, product management, software engineering. In the data world in particular, data science is the fourth leg of the stool. Those four lenses on the problem are all going to be slightly different. The data scientist is going to be thinking differently, asking different questions, unpacking the user's responses in a different way than a designer is, than a product manager might be doing, than an engineer might be doing. And it's really good to get those different ears on it, if not mouths on it, asking questions. I think that's really important because I don't even know what I don't know about data science. It's such a giant area. I'm not a data scientist even though I try to serve them in my work, but it's such a huge field. I know a little bit about engineering. I can write some code and I've written some applications and I used to do development, so I know a little bit to get dirty there, and product and design is really my safe space. That's my space that I know really well, but I know enough to know that I don't know how to ask really good data science questions. In English, in the language of my users, though, that's also important. They don't want to hear about all the technology. They don't care if it's in the cloud, they don't want to hear about pipelines. That stuff is all irrelevant. Anyhow, having those multiple perspectives is really important. I don't really like all of it getting piped through one funnel person, because now, especially if that product person or whoever comes back and they don't have the right letters behind their name or VP of whatever or some title thing, and they're saying, "Well, I talked to these eight people," and this happens in the UX world. Well, we talked to these eight people, none of them really want this and they don't want it because they do X. Yeah, yeah, but we've been working on this for two years and we're going to cloud because that's the strategy. Well, that's nice, but that doesn't jibe with all the information I've heard.
And if it's all being routed through one person whose job it is to do that, it's just a he-said versus the culture, the ship that's been going at 20 miles an hour for two years. Good luck changing that. But if you've got four smart people on it, and these aren't necessarily always four separate bodies, but these four hats I think are important to have. I think there's a better chance to actually change something when it's not on the right track, when it's like, the team thinks this, the team was told to own this problem and this is what the team found out. I think there's more clout to that.
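As a concrete illustration of the "definition of done" Brian describes, here is a small, hypothetical sketch of how a team might check the 12% lift target against the manual baseline. Every number and name in it is invented for the example.

```python
# Hypothetical check of the "12% lift before November" definition of done.
# All numbers and names here are invented for illustration.

def lift(baseline_rate: float, test_rate: float) -> float:
    """Relative improvement of the test variant over the baseline."""
    return (test_rate - baseline_rate) / baseline_rate

# Manual creatives (current process) vs. auto-generated creatives (the data product).
manual_conversions, manual_impressions = 480, 40_000
auto_conversions, auto_impressions = 610, 44_000

baseline_rate = manual_conversions / manual_impressions
test_rate = auto_conversions / auto_impressions

observed_lift = lift(baseline_rate, test_rate)
TARGET_LIFT = 0.12  # the before/after state everyone agreed to up front

print(f"observed lift: {observed_lift:.1%} (target {TARGET_LIFT:.0%})")
print("definition of done met" if observed_lift >= TARGET_LIFT else "keep iterating")
```

The point is not the arithmetic; it is that the team and the stakeholder agreed on the target and the measurement before anyone argued about models versus hiring four designers.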

Tim Gasper: I love that you're bringing up Marty Cagan and some of his methodology. In addition to being a customer guy, I have a product background, and one of the things that we've implemented here at data.world is that outcome-oriented teams really own the problem, and what you mentioned, the hats. You have to have your technical hat, your product manager hat, et cetera, the design hat, user experience hat. I think one thing that's interesting is how much of product-oriented thinking ends up being applicable not just to the software world but also to the data world, with some notable differences. One thing that I'm curious about your perspective on is the application of product management from software to data. Software and data are not the same, but there seems to be a lot of applicability across the two. What's your perspective on data product management versus software product management? Pretty different, not very different?

Brian T. O'Neill: That's a good question. The main difference, I think, that perhaps sets it apart, I think there's more similarities than differences. If all you did was learn the fundamentals of software product management, you eliminated the stuff that may not be quite as relevant, like pricing, because you're working at a bank and you're serving the risk department. You don't need to worry about pricing strategy, so you get rid of some of the aspects of software PM that might not be really necessary there. Things like marketing, however, can be relevant even when you're internal. The main difference though, I think, is that I feel like, especially with machine learning, we're not dealing in systems that repeatedly generate expected outcomes all the time. We're dealing with gray areas, we're dealing with ranges of stuff. When we think about charts and KPIs and metrics and analytics, the answer isn't, "Did the transaction go through or not?" Yes or no? It either did or it didn't. Oh, you typed something wrong on the form and it generated an error. These are very binary things that are easy to tell if it worked or it didn't. Code complete, QA complete, whatever. It's like, "Well, here's the KPI, it's 68, and it's up 20% from last... is that good? Is that bad?" We live in this whole gray area where it's like, just unpacking, "Well, what does 68 mean?" Well, we have to have a conversation now with the people who are supposed to be using these metrics to do stuff. What are the qualitative ranges around these numbers such that we could actually do something useful and not just shovel metrics at people? Because guess what? No one really wants your machine learning or analytics. Nobody wants a dashboard. They really don't want that stuff. They want to know, what do I do different? What do I do the same? What's going to get me a raise? What's going to make me look good? What's safe? There's a lot of different things, but it's usually not, I just want to go see my numbers, my metrics. And even though, yes, sometimes we want to get grounded in the reality of what's happening, I need the facts on the ground, that's never the end. There's always that, "Well, what's next? Now what?" It's always right around the corner. And a good team is always designing for what's around the corner, from the beginning, they're not waiting. They're always thinking about, "Yeah, but what am I going to do with that 68?" And when we're talking about predictive analytics and stuff, it's like, "Well, the range of predictions, it's zero to 100." We have to think about how we're going to design for when it says 10, when it says 50, when it says 100, when it says 80. What triggers an action? At what point would you do something different than you did yesterday when you looked at the dashboard and it said 17? We have to think about 100 different things now and how we're going to deal with that. It's almost a trade-off, because you don't have some of the responsibilities of traditional software product management, but you do have all this gray area of predictive intelligence. I mean, look at all this stuff happening now with generative AI. It's just like, "The machine's telling me it's in love with me." Writing me sonnets and now it's sad, all this unexpected stuff that's going on, and thinking about how do we design for these scenarios.
Even within the enterprise, when you think about business stuff, and maybe some stuff that's not quite... in the creative space, there's more variables, I think. I think the problem space is actually richer and almost more fun. But I think it's harder too, and this is why you have these dashboard deserts. You have 10,000 Looker dashboards and 30 of them get used at some of these companies.
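To make the "what do I do with a 68?" problem concrete, here is a minimal, hypothetical sketch of the design decision Brian is pointing at: translating a raw score into qualitative ranges and a recommended next action. The thresholds, labels, and actions are invented for illustration; agreeing on them with the people who have to act is the actual design work.

```python
# Hypothetical mapping from a raw prediction (0-100) to a qualitative band and a
# recommended action. Thresholds and actions are invented for illustration; in
# practice they come out of conversations with the people who act on them.

SCORE_BANDS = [
    # (upper bound inclusive, label, suggested action)
    (30,  "low",      "no action; keep monitoring"),
    (60,  "moderate", "review at the weekly ops meeting"),
    (85,  "elevated", "assign to an analyst within 24 hours"),
    (100, "critical", "page the on-call owner now"),
]

def interpret(score: float) -> tuple[str, str]:
    """Return (band label, suggested action) for a model score in [0, 100]."""
    for upper, label, action in SCORE_BANDS:
        if score <= upper:
            return label, action
    raise ValueError(f"score out of range: {score}")

for s in (17, 50, 68, 92):
    band, action = interpret(s)
    print(f"score {s:>3}: {band:<9} -> {action}")
```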

Juan Sequeda: I would argue that when you do software development, you have to have well-defined requirements. It's a well-defined, known world that you get into. But with data, you will start with some known use cases and things you're going to do, but what's going to happen is what's around the corner. So, there's a lot of unknowns out there that you need to be prepared for, which makes it a much more challenging problem. Challenging because you have to deal with a lot of unknowns. If you compare the software world, there's less, I mean, I'm not saying there are no unknowns, there are unknowns, but it's much less. I think that the data world is designed to go deal with unknowns there. And I think that's a difference here, which leads me to this. One of the topics we wanted to talk about was adoption. It's one of the things that you've been talking about, that we need to focus on adoption. Why isn't data being adopted? What is bad adoption? How do you achieve good adoption then? I would like to talk about adoption here.

Brian T. O'Neill: What's good adoption?

Juan Sequeda: What we see a lot is that people are doing things with data or is it being used, not being used. We're spending all this effort to go do this, so we are solving some problems, but when we solved the problem, that was it. So, we didn't really think about it in a broader space. How are we investing our time and our money to go deal with, creating all these data products? At the end of the day, I don't want, as we talk about software being shelfware, right? We're seeing a bunch of data analytics also becoming shelfware.

Brian T. O'Neill: Sure. And that was a little rhetorical or flippant when I said, "Well, what's good adoption?" And the point I was making there is that you said good adoption. And I say, "Great, well, what's bad adoption right now?" This gets back to how do we measure these things? Let's put a stake in the ground, define what bad is if we think we have it, and what good is, what's our target? This is a challenge, is that no one's actually taken the time, including the business sometimes, to actually quantify this stuff. I think they think that's the job of people with data skills to go do that. And the data people are like, "No, you need to tell me what an improvement is. What's good enough? Five, 100, 1000? I have no idea. That's a business question." It's a you-all question to me, and again, this is why designers, we're trying to get all these people in the room. We do this work together and it's all in there. It's just that a lot of the work of design is facilitation. It's simply extracting stuff out. It's asking the right questions from different perspectives and surfacing what's probably in there. It's just not going to be written down neatly in a Jira ticket or requirements document or whatever. It's there, but it needs to come out and we have to learn how to ask the right questions to do that. But the low adoption thing, again, we have to define what good adoption is, but I think the classes of problems are utility problems (it's not useful to me), usability problems (it's too hard to get any value out of this thing), and the problem was not well-defined, so this solution solves a problem that I don't even understand or I don't need or I don't recognize. You could go on and on. There's trust issues, which is, I don't know how the model came up with this. Or even though it says, "Oh, I've got SHAP values, or I've got these features that tell me whatever." It's like, why do I care what the zip code is, whether or not I'm going to give this person a mortgage or whatever? We don't look at zip code, and you're telling me that. It's just like, "Okay, I'm not going to use that." Excuse me. One sec.

Tim Gasper: No worries. I love this discussion around adoption because I think it's really helpful to, first of all, define what is good or bad adoption.

Brian T. O'Neill: What is the adoption today? And is that good or bad? A lot of this is that we don't even understand or know where our baseline is today, and we just need to be able to understand that. And you understand that by really figuring out what's our status quo, who's involved or not.
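A minimal sketch, with invented names and numbers, of what "put a stake in the ground" could look like in practice: compute today's baseline adoption for one data product from usage logs before arguing about whether the number is good or bad.

```python
# Hypothetical baseline-adoption calculation for one dashboard or model.
# The users, log entries, and window are invented for illustration.
from datetime import date, timedelta

intended_users = {"ana", "bo", "carla", "dev", "eli", "fran", "gus", "hana"}  # who it was built for
usage_log = [  # (user, date of a meaningful action, e.g. exporting a lead list)
    ("ana", date(2023, 3, 6)), ("bo", date(2023, 3, 7)), ("ana", date(2023, 3, 9)),
    ("dev", date(2023, 3, 8)),
]

window_start = date(2023, 3, 6)
window_end = window_start + timedelta(days=7)

active = {user for user, day in usage_log if window_start <= day < window_end}
baseline_adoption = len(active & intended_users) / len(intended_users)

print(f"weekly adoption baseline: {baseline_adoption:.0%} "
      f"({len(active & intended_users)} of {len(intended_users)} intended users)")
# Only once this baseline exists can the team define what "good" adoption would mean.
```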

Tim Gasper: Exactly. I think we're diving into adoption and we're trying to unpack that. And the other piece that Brian brought up is around trust, ability to trust the data. And I think these two things are interplaying with each other a lot, the adoption and also the trust.

Brian T. O'Neill: Sorry about that. I'm getting over a cold and I had a cough attack there.

Tim Gasper: No worries at all. I think you're setting a very interesting stage here around adoption.

Brian T. O'Neill: I think we covered some of those major areas there. I think the biggest one I could say is that a definition of quality, how will we know if we did a good job with this solution, is often very opaque. There are no measurement criteria, there are no KPIs or metrics for knowing this. And if the team doesn't collectively understand this in a common language, then how will we ever know when we arrive at the destination and that we actually pushed something of value out to the people that we're supposed to create value for? It's really hard. And to me, this is not good for the customers and users. It's not good for the stakeholders. I don't think ultimately any profession, any data professional or software professional, really wants to work on stuff that doesn't get used. Eventually, you're going to get tired of it, like shelfware, as you had said. It's just not fun. It's not good for anybody really.

Tim Gasper: Anyways, a lot of thoughts are swirling in my head right now.

Juan Sequeda: Well, I think this is a good segue. We have our lightning round. Let's go into our lightning round, which follows up on a lot of the other things we wanted to talk about. Let me start off with this one. So, just quick, yes or no, and provide some context here. Is human-centered design a process you implement?

Brian T. O'Neill: Yes, it is.

Juan Sequeda: Okay. Well, on this, what do you recommend? I know we're jumping ahead now, but what is the best way for people to start learning about human-centered design?

Brian T. O'Neill: There's tons of resources on this. I have obviously stuff on my website, designingforanalytics.com. Sorry, my voice. Our timing, right, with this cold.

Tim Gasper: No worries.

Brian T. O'Neill: But yeah, I've got stuff on my website that you can check out there about how to go about doing this work. I have a course actually that I teach on this. So, if you go to designingforanalytics.com/course, you can download a free module of that and it will take you actually through all the modules that I teach there. And that's been designed to be relevant to data professionals, taking out the stuff that's absolutely not necessary there. Just like with data science, there's tons of different methods and stuff, techniques, models, different ways of approaching problems. I've tried to take out all the stuff that's not necessary, so you're not overloaded, because ultimately, you're not trying to become a professional designer if you're a data scientist. I know for most of them, that's not their goal. My goal is to give you a set of tools that you could start using now to be a better data scientist, not to become a designer per se. And I disagree with this idea that only designers with a capital D can be designers. And this is a whole debate in the design world, but my feeling is you cannot not design solutions. Every choice is a choice. When we're making something for somebody else, we are designing it, whether or not we gave it a lot of thought and care or whether we didn't at all. Either because we don't care or we just don't know what we're not doing, we're still designing it. So, let's learn how to just do it better. Let's learn how to put intention behind the choices and start to think about these things, the non-technical parts of the solution here that are going to be really relevant to making sure the technical stuff ever sees the light of day and gets used.

Tim Gasper: Love that. Second lightning round question here. Can dashboards be data products? Or, two A and then two B: can machine learning models be data products?

Brian T. O'Neill: How do you interface with the model, I guess, right? And again, what does end-to-end mean? I think some of this is in the eyes of the data product team. It's in the eyes of the customer. Does that dashboard... is all my job is to change this dial over here manually based on information that comes off this dashboard? You could say, "Well, that's enough." I go from the digital world back to the real world. My job is to tune the factory manually using insights from this thing. That sounds pretty end-to-end to me in a lot of ways. So, yes, I think it could be.

Juan Sequeda: Third question. We talked about different roles here. How about data product managers, should companies be investing in data product managers?

Brian T. O'Neill: I think so. I'm already hearing this. I think this is starting to catch on, again, particularly more in the non-digital enterprise space. There are definitely teams that are starting to do this stuff. If you check out my podcast, you can see some interviews or listen to some interviews with some CDOs that are doing this work. But yes, and several of them have come out of software backgrounds and they're carrying this over. And they're talking about very much the same things that I'm talking to you about today. There's this role called analytics translator, which is, I think, a McKinsey-titled role that came up. I almost don't like to bring it up because I don't want to see that get used anymore. I don't care for the title because it sounds like something that you do after the fact, because translation is always something that follows something else. And I don't like this idea of, "Well, we do the really technical thing and then we translate it and make it easier or something." Duh. I don't like the title, but I love the work that that community is doing. I think it's very similar to data product management. So, I'm fully on board with the work that they're doing to try to act as an intermediary here between technical teams and business users, et cetera. But yes. Again, the question is, what are your objectives? What are your goals? And if business value from data is your goal, then again you have to get through the adoption piece somehow. And so, the question is, well, whose job is it to get adoption? Whose job is it to make sure that these things get used? And if they don't, to make the requisite changes to ensure that they start getting used? I don't know whose job that is. I don't really care if they have the title of data product manager or not. I'm just saying it's really hard to win at this game if nobody's using the stuff that we're making. It's just hard.

Tim Gasper: Somebody needs to think about and care about the value and the adoption, which I think leads us well into the fourth and last lightning round question. Towards the end of our discussion today, we talked about data value and articulating what that data value is. Should companies literally track data ROI?

Brian T. O'Neill: Should they track a data ROI? Well, the chief data officer's job is often to define a data strategy, so I would assume that they would want to track this so that they can justify their own work, but also help the business understand, well, we're putting all this investment in, what are we getting back? So, absolutely, I think it makes sense to do that. I think the challenge is often going to be, how do we measure these things? And I actually interviewed a great... I love this book, Doug Hubbard, I mention him frequently. He wrote a book called How to Measure Anything. And I think this is a great book for so many teams to use. And he teaches us how to do this. And where teams usually get screwed up here is they mix up the idea of measurement and accuracy. And so, measurement might end up being qualitative feedback from some key people. That is the measurement of whether or not we're using data properly, whether there's been an ROI on the data. I don't know what it is, but there's a difference there between the accuracy and the measurement thing. But the idea of trying to actually put a measurement on these things so we can start to track things, I think this is really important. I hold my clients to this when I'm doing consulting work and stuff. I don't work with a client where they can't define what an improved state is, and it can't just be completely subjective or I'll-know-it-when-I-see-it. It's a trap. You're walking into a trap with that thing. And my job is actually sometimes to help them figure that out by asking good questions to surface the current state now and the desired future state, to understand what improvement means. Then we can talk about, well, how do we measure those things together? And a lot of times, they haven't done that, but the activity of going through this is very therapeutic because now, people feel like, I finally know what we're doing, and I didn't even know we didn't care about this metric. I had no idea the business is actually really focused on this metric over here, because we took it for granted or they didn't think that the people on the ground needed to see those facts. And it's like, "That's actually really relevant to my work. Why am I ingesting all this data from this other system over here when all we're talking about right now is vendor data? I don't need millions of customer records here. That just saved me a whole bunch of time." That's like, "What are we doing here?" So, I think being able to express that clearly is important.
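As a hedged illustration of the measurement-versus-accuracy distinction Brian borrows from Doug Hubbard, here is a small sketch that estimates the ROI of a data product as a range rather than a single number. Every figure in it is invented for the example.

```python
# Hypothetical ROI-as-a-range estimate, in the spirit of "measure it, even imprecisely."
# All figures are invented for illustration.

analysts = 12
hours_saved_low, hours_saved_high = 1.0, 4.0   # hours saved per analyst per week (elicited range)
loaded_hourly_cost = 85.0                      # fully loaded cost per analyst hour
weeks_per_year = 46
annual_product_cost = 150_000.0                # build + run cost of the data product

def annual_value(hours_per_week: float) -> float:
    """Annual value of the time saved at the given hours-per-week estimate."""
    return analysts * hours_per_week * loaded_hourly_cost * weeks_per_year

low, high = annual_value(hours_saved_low), annual_value(hours_saved_high)
print(f"estimated annual value: ${low:,.0f} to ${high:,.0f}")
print(f"ROI range: {low / annual_product_cost - 1:.0%} to {high / annual_product_cost - 1:.0%}")
# A wide range is still a measurement: it tells the team whether the investment is
# plausibly worth it and which uncertainty is worth reducing next.
```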

Juan Sequeda: This has been a phenomenal conversation. You are very thoughtful and we've gone through so many details. We're going through our takeaways and let's see how we do because we have a lot that we wrote. We took our notes. So, Tim, take us away with takeaways.

Tim Gasper: All right. Oh, my goodness. So much gold in these mountains. First of all, data products are a complete end-to-end human in the loop decision support system that is so good somebody might pay to use it. I thought that was a very thoughtful definition of what data products are. Maybe what data, analytics, et cetera, products are, because really, they all are about providing value for making better decisions. And you outlined that one of the key things that people think that is incorrect is that people often think that data products are the result of labor, that analysts are creating data products and they're like, "It's coming out the other end of the machine line." And even though there is a reality around, okay, there are humans involved and data teams involved in putting these things together, really it's about the value. It's the fact that there is an asset or a thing that comes out that has value. Would they pay for it or exchange something for it? Is it something worth trading for? And there's intentionality around it. It's not just the byproduct. Are data products different than analytics products? You would say no, they're all decision support systems, and the people are key, the human in the loop. So, third parties, first parties, multiple folks, maybe developers, right? There's a human in the loop somewhere that is a part of the overall system. What is the system not doing? Don't fall into the trap of building something that could do lots of things, because usually people don't end up using it to do those things. And later on, you mentioned things like dashboard deserts, which I think is a very stark and concerning metaphor for where things can really go wrong. Human-centered design was a very important aspect today. It is not how you display it or taking guesses about what should be there or prognosticating. It's about working backwards from the humans and having the technology meet the human need where it's at, not pushing technology to try to change human behavior. And so, you talked about really collaborating with the team. So, design not for the team, but with the team. If you don't talk to them, you don't know their fears. You're not just starting with technology or starting with the solution. You're trying to start with the problem and own the outcome and own the problem. And you need to play the game of user adoption first. If you're going to skip the users in the loop, you're skipping past the value. And before I hand it over to Juan, who I know has plenty of takeaways too, I loved your quote: shut the F up. Listen for facts and information. You said stop talking and you implied you've got to ask questions. I think that's some really good advice.

Juan Sequeda: And following up on that, I love how you say that even if you're an introvert who doesn't want to go talk to people, that's fine, because actually you shouldn't be talking that much. You should just spend 3% of the time talking, just asking questions, just listening. And then, when you start listening, people are like, "Well, we need K-means clustering on Snowflake." Well, I could just be that data drive-through and deliver that. But actually, and this is so thoughtful, the stakeholders are trying to speak your language. They're stepping more into your territory. It is your time to actually understand, to ask the why, and truly understand the problem we're trying to solve right there. That was a very, very important takeaway from you. And then, also, Marty Cagan's point about teams owning the problem space, not the solution space. The data teams should be owning that problem space. We had this discussion of data versus software products, and there are more similarities than differences, right? You can learn the core basics of data product management; maybe pricing is not as relevant, but marketing is. And actually, from a personal perspective, talking about adoption, you want to be able to go market those data products that you have. There are things you can go learn from that.

Tim Gasper: You can have a good data product and sometimes it fails, not because the product wasn't good or wasn't valued, but because nobody marketed it. You need to market it.

Juan Sequeda: And then, product managers have to live in this gray world, you talked about that. The KPI is 68, is that a good thing? Is that a bad thing? And we really need to think about what's next, then now what, right? So, a good team is good at figuring out what is around the corner and building towards what is valuable and meaningful. And then, talking about user adoption, it's all about understanding what these things mean. What does good adoption mean? What does bad adoption mean? Where are we today? And this is not a problem that the data team figures out alone. It's an us problem. We have to go figure it out together, because at the end of the day, whose job is it to deal with adoption? If you don't know who's going to be dealing with adoption and managing it, let's go figure that out. How did we do? Anything we missed?

Brian T. O'Neill: Are you guys, are you like ChatGPT-3 clone things?

Tim Gasper: We actually wrote it. Maybe we'll be out of a job soon, because ChatGPT is just going to listen and it's going to be like, "Takeaways." Instead of Tim's takeaways, it's going to be ChatGPT's takeaways.

Brian T. O'Neill: It's pretty good, pretty good, especially for two guys drinking vodka sodas.

Juan Sequeda: Hey, I'm actually taking notes on my phone and stuff, but I don't know.

Brian T. O'Neill: You guys did a great job. You did a wonderful job.

Tim Gasper: It was all from you, and it was awesome.

Juan Sequeda: Yeah, this is just your content, and I think there's a lot of gold in here, a lot of nuggets. Thank you so much. I'll wrap up.

Brian T. O'Neill: Yes.

Juan Sequeda: Throw it back to you. Three questions. First, what's your advice about data, about life? Second, who should we invite next? And third, what resources do you follow? People, blogs, conferences, podcasts, whatever.

Brian T. O'Neill: Sorry, could you do... let's do them one at a time.

Juan Sequeda: What's your advice about data, about life, whatever?

Brian T. O'Neill: My advice? Well, you're probably not spending enough time with the people you're building stuff for, and we have to get out of the trap of giving people what they asked for. I think teams need to realize that's often a trap, and it's not because they're trying to trick us. It's not because there's any malicious intent. It's just that the need is often not on the surface. We're dealing with much more complicated things in today's world. Relative to 100 or 200 years ago, these things are much more complicated, and it's harder to express the actual need right on the surface a lot of the time. Caveat: sometimes K-means clustering might be the right way. We have to be open to the fact that maybe that stakeholder does have a good idea. Maybe it was a random guess that was right, or there is some relevance there, or they have a hunch from experience or something like that. Sometimes there's some reason to trust that. And I say that just as much because, as a designer, I used to get really threatened when a client or a customer or a stakeholder or a product manager would have really detailed advice about what color something should be or how it should be laid out or whatever. It's like, "That's my world. You're threatening my space." And as you get older, I think you start to realize collective knowledge is actually really valuable here. So I want to caveat that part a little bit.

Tim Gasper: That's sage advice.

Juan Sequeda: Second, who should we invite next?

Brian T. O'Neill: I just saw Omar was in the thread. You guys know Omar Khawaja at Roche? Have you talked to Omar?

Juan Sequeda: We do. Omar has been a guest on our podcast.

Brian T. O'Neill: Okay. Yeah. Hi, Omar. I saw you throw a message in there about investing in data product management. So, that's great. Omar's great. It's not exactly data science and analytics, but again, I think the measurement question comes up frequently: how do we know if we did a good job? How will we measure success? All these things. Learning how to measure stuff is great. Ask Doug Hubbard, the author of How to Measure Anything, to come on your show. Very different perspective. I mean, he knows the statistics and the math about how to do all that stuff, but the first half of his book is not about the math and doing the measurement. It's about asking these questions and separating measurement from accuracy. I think it could be really useful for data professionals to learn how to do this with their customers, stakeholders, and sponsors.

Juan Sequeda: That's an excellent suggestion.

Tim Gasper: I like that. Yeah.

Juan Sequeda: And finally, what resources do you follow?

Brian T. O'Neill: What resources do I follow?

Juan Sequeda: You should listen to your own podcast.

Brian T. O'Neill: I never listen to my own stuff for advice, because I've already heard the advice that came from me. No, I'd say one of my favorite podcasts is The Knowledge Project with Shane Parrish. I don't know if you guys know it, but I would definitely check it out. That's probably my favorite podcast for thinking about decision frameworks. He's got great interviews with really good leaders and stuff, but I just think, from family to work, there's a lot of really good stuff that he goes into with his guests. So, that's one of the main resources I go to right now. It's less on the data-specific stuff.

Juan Sequeda: All right. Well, Brian, first of all, thank you, thank you so much. Quick note: next week, we're going to have a very special, different episode. We're just going to leave it at that. It's going to be shorter, it's going to be live, and it's very, very special and different. People are like, "What's going on?" So, everybody will be surprised. But with that, Brian, thank you so much. We had such a thoughtful conversation. I think we touched on so many important topics. I just want to go off and read a lot of the stuff that you have and listen to a lot of your podcast episodes. I think we need to have more of these conversations. So, Brian, again, thank you, thank you, thank you. And as always, thanks to Data.World, who lets us do this every Wednesday.

Tim Gasper: Cheers.

Juan Sequeda: Cheers.

Speaker 1: This is Catalog & Cocktails. A special thanks to Data.World for supporting the show, Karli Burghoff for producing, John Loyans and Brian Jacob for the show music, and thank you to the entire Catalog & Cocktails fan base. Don't forget to subscribe, rate, and review wherever you listen to your podcasts.


Today's Hosts


Tim Gasper | VP of Product, data.world

Juan Sequeda | Principal Scientist & Head of AI Lab, data.world

Today's Guest


Brian T. O'Neill | Founder at Designing for Analytics