Smart Products

Championing AI As A Product Manager With Ali Nahvi From Salesforce

Episode Summary

I'm excited to bring you this conversation with Ali Nahvi. Ali is a Sr. Technical Product Manager for AI and Analytics at Salesforce. During this conversation, he shared his thoughts on championing AI initiatives as a product manager, translating business needs into AI problem statements, and how to position yourself for success.

Episode Notes

I'm excited to bring you this conversation with Ali Nahvi. Ali is a Sr. Technical Product Manager for AI and Analytics at Salesforce. During this conversation, he shared his thoughts on championing AI initiatives as a product manager, translating business needs into AI problem statements, and how to position yourself for success.

Links

Ali On LinkedIn

Transcript

[00:00:00] Ali Nahvi: To get there, to build that success story, you need to fail. And failure is part of the process, and sometimes it's not easy for people to see that.

[00:00:09] Himakara Pieris: I'm Himakara Pieris. You're listening to Smart Products, a show where we recognize, celebrate, and learn from industry leaders who are solving real-world problems using AI.

[00:00:19] Himakara Pieris: I'm excited to bring you this conversation with Ali Nahvi. Ali is a senior technical product manager for AI and analytics at Salesforce. During this conversation, he shared his thoughts on championing AI initiatives as a product manager, translating business needs into AI problem statements, and how to position yourself for success.

[00:00:37] Himakara Pieris: Check the show notes for links. Enjoy the show. 


[00:00:43] Himakara Pieris: Ali, welcome to Smart Products. 

[00:00:47] Ali Nahvi: Thank you so much, Hima, for having me.

[00:00:49] Himakara Pieris: To start things off, could you share a bit about your background and how you got into AI product management?

[00:00:58] Ali Nahvi: I'm an accidental [00:01:00] product manager. I started my career journey with business intelligence, and I guess it was around 2012 or '13 that I heard the word data science for the first time. Before that we simply called it math. And I loved the idea. I decided to move from BI to AI, and that was the major trigger for me to come to the US and do a PhD.

[00:01:27] Ali Nahvi: I did my PhD on applications of AI and ML in the context of project management, and after that I started as a data science consultant at a consulting company. And one day, out of the blue, my roommate from grad school, who was working at Amazon at the time, called me and said, hey, we have this thing called product manager, and I think you should become one of them.

[00:01:56] Ali Nahvi: I did some research, and very quickly I [00:02:00] got the same impression: well, this could be an ideal job for me. I love helping people. I love solving business problems. I love AI. And I also love business development and communication and being around people.

[00:02:16] Ali Nahvi: So I thought, well, that might not be a bad idea. I joined Iron Mountain in my first product manager role. Then after a while I joined another company, Cengage, which was mainly focused on online education. And recently I joined Salesforce as a senior technical product manager for AI and analytics.

[00:02:45] Himakara Pieris: What is the primary difference you see going from BI to data science to AI as a product manager? Do you need a different skill set? Are those BI skills transferable across all these verticals? [00:03:00]

[00:03:00] Ali Nahvi: Yeah, business intelligence is definitely still helping me a lot.

[00:03:04] Ali Nahvi: And from a data science perspective, I'm one of those PMs who thinks that PMs should be technical and have the ability to have super technical discussions with the teams, especially in the data science space. In the data science and AI world, understanding the problem, understanding the business requirements is, in my opinion, solving half of the problem.

[00:03:31] Ali Nahvi: If you get there, if you can really digest the problem statement and have the ability to translate that into data science language, then you are a really good PM. And to do that, for me, having a technical background in data science has been extremely helpful.

[00:03:51] Himakara Pieris: What would be a hypothetical example of translating a business requirement into data science or machine learning language? [00:04:00]

[00:04:00] Ali Nahvi: Let's say I'm assigned to work with a stakeholder in sales or marketing. I set up a call, sit with them, and say, hey, what's your pain point?

[00:04:12] Ali Nahvi: And they say, okay, I wanna increase sales and productivity. So I would say, okay, can you explain what you're doing on a day-to-day basis? And they explain this whole sales process they go through, from lead generation to sales calls to closing deals, and I might be able to find some opportunities there

[00:04:36] Ali Nahvi: to use AI to help them do a better job. For example, the lead generation piece: maybe you don't need to call all the leads coming your way. Maybe you can optimize that. But then you need to build a bridge from that really vague business problem to a very solid, robust data science problem. [00:05:00]

[00:05:00] Ali Nahvi: The business requirement doesn't give you anything like a dependent variable, independent variables, the data structure, anything like that. So as a product manager, it's my job to help the team define that problem. And another thing I believe, and another reason I think data science product managers should be technical, is feature engineering.

[00:05:22] Ali Nahvi: That's an extremely delicate thing to do, in my opinion. It's where you tie business with science, and you really need to have a good understanding of how data scientists do feature engineering. At the same time, you need a robust understanding of how the business operates to incorporate all the important features in your feature engineering and make sure you capture all the important elements.
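To make that translation concrete, here is a minimal sketch (not from the episode) of how the lead-generation example might be framed as a supervised classification problem. The feature names, data, and choice of scikit-learn are assumptions for illustration only.

```python
# Hypothetical sketch: framing "which leads should we call?" as a
# supervised classification problem. All column names and values are invented.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Historical leads with engineered features (the "bridge" from business
# language to data science language) and a known outcome.
leads = pd.DataFrame({
    "company_size":      [10, 500, 60, 2000, 35, 120],
    "days_since_signup": [1, 14, 3, 30, 2, 7],
    "pages_viewed":      [2, 18, 5, 40, 1, 9],
    "industry_score":    [0.2, 0.8, 0.4, 0.9, 0.1, 0.5],
    "converted":         [0, 1, 0, 1, 0, 1],  # dependent variable (label)
})

X = leads.drop(columns="converted")   # independent variables (features)
y = leads["converted"]                # dependent variable

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, random_state=42, stratify=y)

model = GradientBoostingClassifier().fit(X_train, y_train)

# Score new leads; in the 2,000-leads-per-day scenario you would
# rank by this probability and call only the top 500.
print(model.predict_proba(X_test)[:, 1])
```

The point is not the particular model; it is that the PM's translation work produces the pieces the code needs: a label, candidate features, and a decision rule the business can actually use.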

[00:05:51] Himakara Pieris: You talked about doing these customer interviews or user interviews, looking for opportunities. These might be data [00:06:00] curation opportunities or recommendation opportunities or clustering opportunities, or what have you, that are sort of

[00:06:09] Himakara Pieris: buried in the story that they're telling.

[00:06:11] Himakara Pieris: You identify that and then you transform it from there into a problem statement that machine learning and data science folks can understand, right? Could you talk me through the full workflow that you're using? What are the key steps? It sounds like you're always starting with a user interview.

[00:06:28] Himakara Pieris: What does the rest of the process look like?

[00:06:31] Ali Nahvi: Let's go back to that sales problem again. For example, on lead generation, they say, okay, we generate 2,000 leads per day, but we can only call 500 of them. So the lead optimization problem that I mentioned before would pop up. Or on the sales calls, they say we have a limited number of sales mentors who can help salespeople.

[00:06:54] Ali Nahvi: So maybe we can leverage AI to listen to some of the recorded calls and provide some [00:07:00] insights. These are all hypotheses that could come up, and I write them all down as potential initiatives. And then I ask my stakeholders these questions all the time: let's say we are six months from now, a year from now, let's say we are done with this and we've built this model that is a crystal

[00:07:20] Ali Nahvi: ball and can tell you this lead's gonna make it, this lead's not gonna make it. How would you use it in your day-to-day? How is it going to change your workflow? And based on that, I try to come up with an estimate, ideally a dollar value, of the potential added value that initiative can have.

[00:07:44] Ali Nahvi: And then I work with my team, engineering managers, data science managers, to try to understand feasibility, data accessibility, data availability, and level of effort. Based on that, I create a diagram [00:08:00] where on one axis we have value and on the other we have level of effort. When you build something like that, the highest-priority initiatives immediately pop up and show themselves to you.
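As a rough illustration of that value-versus-effort exercise, here is a small sketch; the initiative names, dollar values, and effort scores are invented, and in practice this would usually be drawn as a two-axis chart rather than a sorted list.

```python
# Hypothetical sketch of the value vs. level-of-effort prioritization.
# All initiative names, dollar values, and effort scores are invented.
initiatives = [
    # (name, estimated annual value in USD, level of effort on a 1-10 scale)
    ("Lead scoring",          750_000, 4),
    ("Sales-call insights",   400_000, 7),
    ("Churn early warning",   300_000, 3),
    ("Document auto-parsing", 900_000, 9),
]

# Rank by value per unit of effort; high-value, low-effort items float to the top.
ranked = sorted(initiatives, key=lambda item: item[1] / item[2], reverse=True)

for name, value, effort in ranked:
    print(f"{name:22s} value=${value:>9,} effort={effort:>2} ratio={value / effort:>9,.0f}")
```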

[00:08:19] Himakara Pieris: Sounds like you're identifying opportunities and then solutions, and then you're going through an exercise of validating those solutions, right? And then it moves to the implementation part. I want to go through and discuss how, and if, it is different from a traditional software development process.

[00:08:41] Ali Nahvi: Absolutely. There are major differences between data science and software engineering, and lots of intersections. The intersections are obvious: they both need coding, they both need infrastructure.

[00:08:54] Ali Nahvi: They both need data. But there is a delicate [00:09:00] difference between them that's kind of hidden in the name data science as well: it's science, it's not engineering. So an element of uncertainty is there. All of these initiatives that we came up with are just hypotheses. We have a hypothesis that based on the current data, based on the current evidence, we might be able to build a prediction model to meet whatever requirement we have in mind.

[00:09:26] Ali Nahvi: But there's a chance that hypothesis won't be right, or even a chance that we build a model but it's not really usable or explainable for the user. These types of uncertainties, I think, significantly differentiate data science from software engineering work.

[00:09:55] Himakara Pieris: How do you account for and plan for the probability of [00:10:00] failure? There is a probability that your model can't make predictions with a high enough level of accuracy. How do you put in guardrails to make sure that this probability of failure is accounted for and planned for in that process?

[00:10:15] Ali Nahvi: That's a fantastic question. I have two mitigation plans for that: one is on the soft side of the business, and the other is quantitative.

[00:10:28] Ali Nahvi: On the quantitative side, I basically look at what they have right now. Do they have any system in place? Let's go back to that lead generation problem. Is it 100% random? If it's 100% random, then I just need to beat random, which is not a super challenge. But on the other hand, if they already have a system that can predict things with a 90% true positive rate, I'm [00:11:00] not gonna touch that. I don't wanna compete with that, because the chance of success is not really high.
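For that quantitative check, a minimal sketch of the idea might look like the following; all the labels and predictions here are simulated, and precision is just one stand-in for whatever metric the existing process is judged on.

```python
# Hypothetical sketch: pursue the initiative only if a candidate model
# clearly beats the baseline the stakeholders already have.
import numpy as np
from sklearn.metrics import precision_score

rng = np.random.default_rng(0)

# Simulated ground truth: 1 = the lead converted, 0 = it did not.
y_true = rng.integers(0, 2, size=1000)

# Baseline 1: today's process is effectively random selection.
random_picks = rng.integers(0, 2, size=1000)

# Baseline 2: an existing system that is right about 90% of the time (simulated).
existing_system = np.where(rng.random(1000) < 0.9, y_true, 1 - y_true)

# Candidate model, right about 75% of the time (in practice: model.predict(X_test)).
candidate = np.where(rng.random(1000) < 0.75, y_true, 1 - y_true)

for name, preds in [("random baseline", random_picks),
                    ("existing system", existing_system),
                    ("candidate model", candidate)]:
    print(f"{name:16s} precision = {precision_score(y_true, preds):.2f}")
```

With numbers like these, the candidate easily beats random but not the 90% system, which is exactly the situation Ali says he would walk away from.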

[00:11:08] Ali Nahvi: I always try to educate my stakeholders on how data science works, and I try to show them some stats around the failure rate of data science projects.

[00:11:18] Ali Nahvi: Nine out of ten fail for different reasons, and I try to be honest about some of the pitfalls and shortcomings of data science in advance and say, hey, this is just a hypothesis we have. It may not work.

[00:11:36] Himakara Pieris: Do you have a probability of success for each experiment that you're running, and do you parallel-track a number of experiments to make sure that you have something that's functional at the end?

[00:11:48] Ali Nahvi: At least at the qualitative level, yes. I try to capture it qualitatively based on the discussions I have with engineering managers, and also my own [00:12:00] experience, my own feelings about the problem, and the evidence that I see. Ideally, we should be able to get to some quantitative level. But even if that's not possible, you can still do it qualitatively.
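One way to make that more quantitative, if only roughly, is to turn the per-experiment estimates into a combined probability that at least one parallel experiment pans out; the individual probabilities below are invented.

```python
# Hypothetical sketch: probability that at least one of several parallel
# experiments succeeds, given rough per-experiment success estimates.
p_success = [0.3, 0.4, 0.5]  # qualitative judgments turned into rough numbers

p_all_fail = 1.0
for p in p_success:
    p_all_fail *= (1.0 - p)

print(f"P(at least one experiment works) = {1.0 - p_all_fail:.2f}")  # 0.79 here
```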

[00:12:16] Himakara Pieris: Sounds like the structure of it makes it difficult to follow something like agile as part of the development cycle. What does the development workflow, or the development cycle, look like in machine learning projects

[00:12:31] Himakara Pieris: for you? 

[00:12:33] Ali Nahvi: Well, to be honest, I think all these things happen before we start development, and I think we should be very picky about what we're

[00:12:44] Ali Nahvi: developing. That's why I set up all these rules for myself, because there are tons of business problems out there. One rule that I set up in my former company was that if we cannot put a dollar value on an initiative, we don't do it, even if it's strategic [00:13:00] and even if it's intuitive that, wow, it has lots of value.

[00:13:04] Ali Nahvi: We have to be able to put a dollar value on it, we have to be able to quantitatively measure it, because there are lots of opportunities out there. We can do lots of things, and we have limited resources. So I try to be some sort of goalkeeper for the team, so that when things get into the development stage, none of these questions come up again.

[00:13:25] Ali Nahvi: And then when things get into the development stage, we try to follow Agile as much as we can. Many people say that Agile doesn't really work for data science. My perspective is a little bit different there. I still think the majority of the things we do in data science have an engineering function.

[00:13:46] Ali Nahvi: I mean, lots of pre-processing, post-processing, lots of the work is just simple data engineering. The data science piece, the model development, is maybe 20% to 30% of the [00:14:00] whole work. For that piece, we definitely need some sort of contingency plan in case things don't go as we expected and we need additional time to try different models and different iterations of models.

[00:14:15] Ali Nahvi: Besides that, I'm still loyal to Agile.

[00:14:18] Himakara Pieris: You talked about the data science modeling component. You talked about the engineering component. This brings me to thinking about the structure of an AI, machine learning, and data science team within a product organization.

[00:14:36] Himakara Pieris: What are the different approaches to putting these teams together? What are some of the roles and responsibilities, and what advice do you have when you think about team formation?

[00:14:47] Ali Nahvi: Well, in my career I've worked with teams in totally different structures. But the thing that worked for me, and I cannot say it's ideal [00:15:00] across all companies, but something that worked for me is

[00:15:05] Ali Nahvi: some sort of separation of church and state. I want product and engineering to be very separate from each other, because there's obviously some conflict of interest going on there. As a product manager, when they ask me to estimate points for a story, even unconsciously I tend to go with one or two all the time.

[00:15:28] Ali Nahvi: But a data science manager may feel totally different about that, which is totally understandable, because they eventually have to deliver. So at a high level, I think they should be separated. And then within a data science team, ideally it would be great if you can have some data engineers

[00:15:50] Ali Nahvi: to help with some of the initial transformations, building some of the pipelines, and helping with centralization; data analysts to help with some of [00:16:00] the pre-processing and post-processing, doing analysis and helping data scientists build and validate hypotheses; and finally data scientists to do the statistical modeling and machine learning and bring explainability to the models.

[00:16:17] Himakara Pieris: I think there is a massive amount of interest in embedding AI and machine learning into products across the board. In this context, we have a large number of established software companies that are profitable and serving clear market needs. And we have product managers inside these organizations who are now looking to find the optimal path to adopt and experiment with AI within their own products. What do you think is the best path?

[00:16:48] Ali Nahvi: It's all about the customer, and the customer should be at the center of what you build and what you design. We need to understand the customer's pain points, [00:17:00] and for doing that there are different methods. If you have a limited number of customers, for example if you are working in a big corporation and you have one product but you only sell it to 12 giant banks, in that case you have the privilege of going to customers, talking with them, and figuring out the pain points.

[00:17:25] Ali Nahvi: But if you have a product that is bigger in scale and has, for example, 100,000 users, then you have to look at the usage data and try to understand some of the pain points your customers are facing when using your product. That piece is not really going to change.

[00:17:48] Ali Nahvi: But the thing that is different from software engineering to AI and ML is when you want to build those [00:18:00] features and capabilities. Let's say I'm in a software company that didn't have a data science or AI/ML practice, and now I wanna add an AI feature to my product. For that, I'm going to start with some of the architectural challenges.

[00:18:19] Ali Nahvi: For that I need data. In software engineering, especially in companies with a legacy of software engineering, they have robust processes for data distribution and data governance, which is a good thing. But when you wanna build data science models, sometimes you need all the data.

[00:18:38] Ali Nahvi: I remember having this conversation with one of the solution architects where I said, hey, we need data from this database, this database, this database, and we need data from these tables. And he was like, okay, but can you specify which columns? And I was like, I don't know. And he was like, what?

[00:18:58] Ali Nahvi: What do you mean you don't know? [00:19:00] And I said, I need all the data. I have no clue which columns I want; until we examine it, we won't know which data we need. Yeah, that was a kind of funny problem. So architectural challenges would definitely be one major challenge there.

[00:19:19] Ali Nahvi: Another challenge is human resources. Data science resources are expensive, and sometimes the company cannot see the immediate value in hiring 10 new data scientists. So sometimes you need to align leadership and the business with your vision and say, hey, if we do this, it can bring this value.

[00:19:44] Ali Nahvi: So try to convert that into some sort of benefit-cost analysis problem for them. Another challenge, as I mentioned earlier, is uncertainty. In a company like Salesforce, failure is perfectly fine. You can fail when you do AI, because [00:20:00] they've been doing AI for so many years, and Salesforce users only hear the success stories because they're using the successful features.

[00:20:13] Ali Nahvi: But to get there, to build that success story, you need to fail. Failure is part of the process, and sometimes it's not easy for people to see that, especially when they've been building software. The way software features fail is totally different from data science features.

[00:20:35] Ali Nahvi: Software won't completely fail, but in data science you have a chance to completely fail and come up with no outcome; that's a possibility. So these are some of the challenges.

[00:20:49] Himakara Pieris: In a nutshell, you have architectural challenges, because the way you perceive data, where you store your data, and how you manage and govern data could be very different.

[00:20:58] Himakara Pieris: And [00:21:00] you often don't get unrestricted access to things. What would be a way to solve that kind of challenge? 

[00:21:09] Ali Nahvi: Ideally, I think data platforms should be designed as a product, not as an engineering solution. There is a distinct difference between a product and an engineering solution, and that distinction is experience. Data platforms have personas, have users, and those users are human beings. Although they can code, although they are technical, they are still human beings, and there is an element of experience that needs to be taken into consideration.

[00:21:39] Ali Nahvi: And I've seen lots and lots of data platforms, lots of system designs and solution architectures, that completely forget about that piece, because they say, okay, our persona is a data scientist, they can code. Yeah, of course they can code, but still, they don't wanna struggle with that.

[00:21:59] Ali Nahvi: So [00:22:00] ideally we should have some sort of platform where I, as an analyst, as a data scientist, as a data engineer, am able to choose the data tables I need. I should have access to some sort of business glossary that tells me what each column in each data table really means.

[00:22:20] Ali Nahvi: That's the ideal data platform to me: the element of experience, the element of customer journey, all those things are considered.
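As a tiny illustration of the business-glossary idea (the table and column names below are invented), the experience Ali is describing boils down to a lookup that answers "what does this column actually mean?":

```python
# Hypothetical sketch of a business glossary: a lookup that tells an analyst
# or data scientist what each column in each table means. Names are invented.
glossary = {
    ("crm.leads", "mql_flag"):   "1 if marketing qualified the lead, else 0",
    ("crm.leads", "sdr_owner"):  "Sales development rep assigned to the lead",
    ("billing.invoices", "arr"): "Annual recurring revenue at invoice time, in USD",
}

def describe(table: str, column: str) -> str:
    """Return the business meaning of a column, or flag that it is undocumented."""
    return glossary.get((table, column), "No glossary entry yet -- ask the data owner.")

print(describe("crm.leads", "mql_flag"))
print(describe("billing.invoices", "arr"))
```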

[00:22:32] Ali Nahvi: But things are not ideal. So what I've done before, and what has helped me, is that I started with some champions.

[00:22:42] Ali Nahvi: Because these companies are really big. Some of these software companies can be very big, and they have lots of teams, different databases, tons of solution architects, and among those folks there might be some people who really get it. There might be some business leaders and also solution [00:23:00] architects who have had some exposure to data science before.

[00:23:02] Ali Nahvi: So one thing you can do is start building something small with them and showcase that capability to others, telling those success stories to others and to leadership, to gradually cultivate this culture around centralization and around a proper data platform for data science, eventually win over different business units within the company, and finally get to that ultimate centralized data solution.

[00:23:35] Himakara Pieris: Another thing you talked about was cultural challenges, not having a culture of failure at some of these companies. How would you overcome cultural challenges?

[00:23:47] Ali Nahvi: In companies like Salesforce, AI is very well understood, even on the business side, even among the non-technical folks, and whenever a new capability [00:24:00] comes up in terms of AI, people look at it as an opportunity. But I've been in companies where the narrative was quite a bit different.

[00:24:10] Ali Nahvi: So a couple of things can happen there. First, alignment with leadership and understanding the priorities would be key, then identifying low-hanging fruit, and by low-hanging fruit I mean AI initiatives that don't require that much building on your end.

[00:24:29] Ali Nahvi: Things where you can leverage technology to do it with the current resources you have. If you have 10 software engineers on your team and don't have any data scientists, you can still do AI. There are tons of services offered by AWS, GCP, and Azure, and lots of companies like DataRobot, Dataiku, and Salesforce have offerings that can be leveraged.

[00:24:58] Ali Nahvi: So you can still leverage the [00:25:00] skill sets of the people you have through technology, in the form of pay-as-you-go, rather than bringing 10 data scientists on board and then trying to show value. You can start small with the current resources you have, leverage technology, prototype it, and go back again.

[00:25:17] Ali Nahvi: Come up with a story, a tailored success story, cultivate the culture, and then, when you have the business team's attention, you can ask for budget to build the things that you cannot easily get off the shelf.
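A back-of-the-envelope version of that buy-versus-build argument might look like the sketch below; every figure in it is invented and would need to be replaced with your own volumes and prices.

```python
# Hypothetical sketch: comparing pay-as-you-go managed AI services ("buy")
# against hiring a data science team ("build"). All figures are invented.
docs_per_month = 200_000
price_per_doc = 0.015            # assumed managed-service price per document, USD
managed_service_annual = docs_per_month * 12 * price_per_doc

data_scientists = 10
fully_loaded_cost = 220_000      # assumed annual fully loaded cost per hire, USD
in_house_annual = data_scientists * fully_loaded_cost

print(f"Managed service: ${managed_service_annual:,.0f} per year")
print(f"In-house team:   ${in_house_annual:,.0f} per year")
```

With assumptions like these, a prototype can be run for a small fraction of the cost of a new team, which is the kind of story that earns the budget conversation later.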

[00:25:36] Himakara Pieris: That sounds like a good approach to overcoming human resource challenges, 

[00:25:40] Himakara Pieris: You can go with more of a PaaS offering, pay as you go, try it out using off-the-shelf products, get some traction, show some value, and use that to justify acquiring new resources to build out your team, and go from there.

[00:25:56] Ali Nahvi: Yeah, 

[00:25:57] Himakara Pieris: Let's say you are a product manager at a mature [00:26:00] software company and you have identified a number of areas where you could effectively use AI and machine learning to deliver unique value, and now you have to go and present this to leadership to get their buy-in.

[00:26:15] Himakara Pieris: What would you recommend they do to secure buy-in? Assume that the leadership is not very current with all the latest in AI and ML, and they are possibly even more exposed to the challenges and pitfalls than to the opportunities.

[00:26:35] Ali Nahvi: You need to propose something, with whatever you have, that would be able to generate revenue. When you do that a couple of times, you earn the trust, and then you can do other stuff. And I can confidently say that rule of thumb is consistent across all businesses, because they wanna make money.

[00:26:57] Ali Nahvi: They don't really care about AI or [00:27:00] technology. They care about their own business. They care about their own customers. And that should be the mindset you have. You should always ask that "so what" question of yourself: let's say we built this. So what? Who is going to benefit from it? What's gonna change

[00:27:16] Ali Nahvi: after building this? So I think that mentality helps a lot with getting alignment with business priorities, building some sort of prototype to showcase to leadership to attract their attention with a limited amount of resources and cost, and eventually sustainable data science development.

[00:27:44] Himakara Pieris: Any success stories or failure stories that you'd like to share as 

[00:27:49] Himakara Pieris: well? 

[00:27:50] Ali Nahvi: So in terms of success stories, when I was with Iron Mountain, my former [00:28:00] company, we wanted to basically build some workflows and processes to use AI and, based on that, be able to parse some of the documents.

[00:28:18] Ali Nahvi: And this is no secret, I'm not giving away any inside information; this is Iron Mountain's business, you can find it on their website. One of the challenges I faced immediately when I got the job was, oh gosh, we need lots of AI to do this, and we are not an AI company. At the time, they had hired 12 machine learning engineers.

[00:28:42] Ali Nahvi: But I did some quick estimation with one of the engineering managers, and we learned that even if we had 100, we could not deliver these things in a year. So I did some research and found a company that had all that AI, and that company's name, [00:29:00] surprisingly, was Google. So we reached out to the GCP folks and talked about the problems.

[00:29:07] Ali Nahvi: I talked with our GCP representative, and we found that the majority of the AI components and features we wanted, they already had, and we could start using them immediately. For some of the things GCP didn't have in their offering, we could leverage our own resources. So eventually the cost of development decreased significantly, from zillions of dollars to a couple of million.

[00:29:38] Ali Nahvi: And the time and the quality of the deliverables significantly improved as well.

[00:29:46] Himakara Pieris: That sounds like a good success story. Are there instances of failure that you can share as

[00:29:50] Himakara Pieris: well?

[00:29:51] Ali Nahvi: So yeah, I had some ambitious ideas back when I was in consulting. [00:30:00] We had lots of legal documents that were manually parsed by folks at the time. And I tried to showcase some AI/ML capabilities to leadership, to basically help with parsing some of those legal documents for our customers.

[00:30:27] Ali Nahvi: And the prototype that I put together was fantastic. Honestly, it was doing a great job, but I didn't think about the scalability problems and the volume of data. Again, I was thinking about it as a data scientist, not as a product manager, and it was one of the most challenging things I've learned in my life, in a very hard way.

[00:30:49] Ali Nahvi: When you are a data science product manager, you have to think about the whole system end to end. The data science and AI/ML piece is just 10% of it; the other 90% is things you should consider in your plans. And I didn't plan for that. So it failed. It failed. It was a catastrophe.

[00:31:08] Himakara Pieris: So essentially, think about how you productionize something, how you scale something, which goes beyond showing value in a sandbox environment.

[00:31:17] Himakara Pieris: That sounds like the biggest takeaway there. 

[00:31:19] Ali Nahvi: Exactly. Yeah. 

[00:31:21] Himakara Pieris: Ali, thank you so much for sharing your insights today. Is there anything else that you'd like to share with our audience? 

[00:31:27] Ali Nahvi: Thank you so much, Hima. I really enjoyed talking with you. I also had a chance to listen to the previous podcast and I loved it.

[00:31:36] Ali Nahvi: I really appreciate what you're doing. I think this is gonna help the data science community a lot, and I hope to see more and more folks become data science product managers.

[00:31:47] (Outro)


[00:31:50] Hima: Smart Products is brought to you by hydra.ai. Hydra helps product teams explore how they can introduce AI-powered features to their products and deliver unique customer value. Learn more at www.hydra.ai