Trust, context, action: Leena Doshi on designing surveys that drive decisions

Description

In this episode of The Curiosity Current, hosts Stephanie and Brianna Boyer talk with Leena Doshi, Senior Manager of Research at Electronic Arts (EA), whose career spans EA and uShip. Leena explains how insights teams differ from analytics by creating new data to answer human questions, and why framing problems as puzzles (a single solution) versus mysteries (ambiguous, with no single solution) is essential in fast-paced industries like gaming. She shares lessons from leading a 10,000-respondent global study, why trust is the real driver of influence, and how her "one new thing per quarter" rule keeps her ahead of technology shifts. Leena also highlights the role of AI in processing unstructured data, the importance of sharp survey objectives ("you can't boil the ocean"), and why insights must evolve from reactive validation to proactive strategy guiding tomorrow's decisions.

Leena - 00:00:01: 

We live in a world where we almost have too much information, too much data at our fingertips. So, it's really important to review what you already have before you start adding data to that existing pile.

Stephanie - 00:00:15: 

Welcome to The Curiosity Current, the podcast where we dive deep into what's shaping today's trends and tomorrow's consumers. I'm your host, Stephanie, and I'm so glad you're joining me. Each episode, we tap into the minds of researchers, innovators, and insights professionals to explore how curiosity drives discovery and how discovery drives better decisions in an ever-changing market landscape. Whether you're a data enthusiast, a strategy pro, or like me, just endlessly fascinated by human behavior, this is the place for you. So get ready to challenge your assumptions, spark some fresh thinking, and have some fun along the way. Let's see where curiosity takes us next with this brand new episode. 

Stephanie - 00:01:00: 

Welcome back to The Curiosity Current. Today, I'm joined on the hosting side by Brianna Boyer, Senior Director of Solution Strategy at AYTM, and a longtime friend and colleague of mine. We've been working together for twelve years. Brianna, thank you so much for joining today. And even more exciting, on the guest side, Brianna and I are joined by Leena Doshi, Senior Manager of Research at Electronic Arts, where she leads the charge in market research strategy to inform business decisions and elevate customer experiences. Leena brings a wealth of experience from her time at EA and uShip, where she has been instrumental in building market research functions, leading global research projects, and leveraging data to drive innovation and growth. In our conversation today, we'll explore the evolving role of market research in a fast-paced industry like gaming. We'll talk about the differences in solving data mysteries versus data puzzles and balancing insights that honor complexity and individuality in gaming audiences with universal truths that can drive action. We cannot wait to dive in. Leena, welcome to The Curiosity Current. We're excited to have you with us today.

Leena - 00:02:12: 

Thank you. I appreciate it. I'm excited to be here.

Stephanie - 00:02:15: 

Wonderful. Well, to kick us off, there is a question that we often start with. In this case, you've had such an interesting career so far spanning several different industries, from gaming at EA to logistics at uShip. What initially drew you to market research? How'd you get into the field, and how has your journey evolved over the years?

Leena - 00:02:34: 

So, it's kind of a funny story, because I literally just stumbled across marketing research as a function. I was actually out looking for MBA programs. I was in the Dallas-Fort Worth area at the time, just wandering around looking at MBA programs, and I stumbled across the Master of Science in Marketing Research at the University of Texas at Arlington. I met with the advisor, he told me all about the program, and I was in. I was hooked. I loved the idea of being able to combine data and, kind of, the human story. And it's been amazing ever since. Over the span of my career, the passion hasn't changed, but my role in the process definitely has changed a little bit, going from more of the "doer" in the weeds, getting into the research, to now more of a strategic leadership, thought leadership role.

Brianna - 00:03:29: 

I love how we all just stumble into this industry.

Stephanie - 00:03:32: 

It's such a common theme. Yeah. I love it.

Brianna - 00:03:35: 

Yeah. We've all got our own journeys of stumbling, so it's always cool to hear. So with your extensive experience in shaping research strategies and driving customer-centric decisions, how do you define consumer insights in today's rapidly changing market landscape, especially in industries as dynamic as gaming and shipping?

Leena - 00:03:51: 

So I really thought long and hard about this question and how I wanted to answer it, and I don't think the "what is consumer insights" part changes. Humans are complicated and can be hard to predict, which is kind of why we study them and why we have jobs. As researchers, in a nutshell, we study human needs, wants, perceptions, and behavior to better understand them. And we also sit in a unique analytics position because we are able to create data to solve problems. For example, at EA we have large analytics teams, but their jobs are all about analyzing data that already exists. They're looking at telemetry data, marketing funnel data, metrics, stuff like that, and analyzing it, whereas researchers or insights professionals have the power and ability to go out and create new datasets to help solve these human problems and challenges. So I think the core of what we do doesn't change, even with the rapidly changing technology and industry environments. What does change, though, is how we're going to meet these challenges, these human challenges. The tools in our toolkit are rapidly changing as technology continues to evolve, and we're going to have to evolve our methodologies with it.

Stephanie - 00:05:12: 

Absolutely. And we're going to get into that a little later in the discussion. So I'm looking forward to unpacking that a little bit more. Before we do that, there's kind of a topic that I wanted to chat with you about. And so to start, I think anytime you move to a new org or a new industry, there's change. You're learning and adjusting to new things. There's something about your journey that I have to imagine, you know, there are some pretty big differences in consumer behavior and even research needs between an industry like logistics versus one like gaming. And this is where I went down a little bit of a rabbit hole last night thinking about games like College Football versus Apex versus The Sims and how complex and different your audiences can really be in gaming. And so I wondered if you could talk a little bit about how you've adapted your approach to insights for the gaming industry. And this is kind of a two-parter: how do you navigate the complexity of audiences in gaming to find that balance between honoring the diversity but still pulling insights that are generalizable enough to make an impact?

Leena - 00:06:19: 

It is not easy. That's the answer. It is hard because you're absolutely right. Our games are our products, and we've got such drastically different products. You can go from Apex Legends to The Sims or to any of our sports games, and like you said, the audiences are completely different. A lot of it is really just recognizing that they are different and not going in with the assumption of, "Oh, well, I'm sure the Madden player is close enough to The Sims player." It's acknowledging that there are differences and, as you do the research, making sure that you're accounting for those differences and not making assumptions about what is or isn't true, because that happens a lot. Folks get into the habit of just making assumptions about who the player is or what they do. So as the insights professional, it's really important that we're able to take a step back and help the business have that awareness about the distinctions and the differences in the player bases. You were asking how we customize it, and I think it's more about working with the business and making sure that they are not lumping people together that they shouldn't be. It's more about how we interact and how we help the businesses, because each of the products works a little bit differently. We have to customize that to each of the teams, essentially.

Stephanie - 00:07:42: 

It makes sense. Are there ever times where you do research where you're thinking about gamers a little bit more monolithically? Or do you find that you pretty much always need to be thinking at the product level and the audience for that particular game?

Leena - 00:07:58:

It kind of depends on what you're talking about because, obviously, our products and our different games are their own brands, and it's important to keep the players in their own buckets for each of the products. But we're also EA as a brand as well, which is hard because EA as a brand is like almost the entire population. It's just complicated, and you have to keep all of these things in mind when you're talking about it. So it really just depends on the objectives and what you're after, whether you're talking about EA as a brand or you're talking about the products, and you just have to customize your approach depending on what it is you're after. Like anything else we do, the answer is "it depends."

Stephanie - 00:08:39: 

"It depends”, very reliable answer.

Brianna - 00:08:42: 

A classic one, for sure. So in your recent YouTube mini-series on solving data mysteries versus puzzles, which was fantastic, I really enjoyed watching that and recommend everyone go check it out, you discussed how data can be approached differently depending on whether it's a mystery or a puzzle. Could you elaborate on how understanding this distinction changes the way market researchers should approach problem-solving?

Leena - 00:09:03: 

I loved doing that little YouTube series. It was super interesting, and I got into great conversations about it. In a nutshell, a puzzle is relatively easy to solve: you find all of the puzzle pieces, you put them together, and then you have your answer. A puzzle typically has a single solution, or in research terms, you have a problem, you conduct a study, and then you have your answer. That's typically a puzzle. On the other hand, you have a mystery, and a mystery is complicated, ambiguous, and doesn't necessarily have a simple solution, or even a solution at all. Because of that ambiguity and complexity, it's best to approach a mystery by gathering and reviewing all of the existing data that you have on the problem. We live in a world where we almost have too much information, too much data at our fingertips, so it's really important to review what you already have before you start adding data to that existing pile. And this is especially true, I think, when you're leveraging any kind of technology. Like all of our other tools, technology is only as good as its operator. So if you happen to think you have a mystery on your hands, the way to approach it is to gather everything you have and analyze that before you do anything else, versus a puzzle, where you have a problem, you create a study, you put all the pieces together, and then you have your answer.

Stephanie - 00:10:29: 

How did you develop that framework? I'm so curious because it's super intuitive, like, as you talk about it, I'm like, "Of course." And I can map a lot of experiences to it, but I have never articulated that, and I've never heard it articulated.

Leena - 00:10:43: 

So the idea came from a book that I read called Intelligence for an Age of Terror by Gregory F. Treverton, and that book is all about data and security at the national level. He was the chairman of, basically, I've forgotten exactly the name of what he was the chairman for, but he wrote this book all about data and technology. So, for example, he talked about "why did 9/11 happen?" in terms of data. Like, who had what data and how it crossed or didn't cross, the left hand didn't know what the right hand was doing kind of thing. I read that book and I was like, "Wow, that's really interesting," and so I took a lot of the lessons he learned from data at the national security level and said, "All right. How does this apply to research? How does this apply to the data and the analytics that we do every day?" That's one of the ideas that came out of reading this book, and he's the one that coined the "puzzle" and the "mystery," so I cannot take credit for that. But I took the learnings that he shared in that book and applied them to insights professionals and research. One of those was, "Okay, if you define something as a puzzle, then just go out and execute the research project, and then you have your answer, versus the mystery, where you likely have way too much data already, so make sure you're collecting and analyzing everything you have before you go and add more data by doing another research study." So, that's kind of how that came about.

Brianna - 00:12:12: 

Very cool. It's so interesting. It's fascinating also that you were able to take this government-level perspective on data and translate it to consumer insights.

Leena - 00:12:22: 

For anybody who's interested, I highly recommend the book, but it is not an easy read. It's tough, it's so detailed and so technical. It's a tough read, but it's fascinating because it's all about how the federal government uses data across all the different agencies, the CIA and the FBI, how they all do it in different ways, and how they work together. And some of the challenges, I think, are similar to ours as insights professionals, because, I mean, EA's got over 10,000 employees, and one of my biggest challenges is always how to get the right person the right data at the right time. He talked about that a lot, too, in the book. So -

Stephanie - 00:13:00: 

That's super interesting. So you, sort of, have hinted at this so far along the way, but I'm curious just to dive into it directly. You know, given the diverse methodologies that are available in market research, from in-home ethnography to focus groups and quant surveys, how do you see the balance among the various methodologies we have access to evolving in the gaming space where I have to imagine the attitudes and behaviors and even those audiences are changing kind of rapidly?

Leena - 00:13:28: 

My philosophy for technology, which includes the huge explosion of AI, is that I hope we're able to let technology focus on what it's good at, which frees up the humans to do what they're good at. And typically, what the AI is good at and what the human is good at is not the same thing, so that's why it makes a really great partnership. I think one of the easiest examples is unstructured data. We let the AI take the unstructured data and create summaries and categories of themes and stuff like that, which is obviously super difficult and time-consuming for a human to do but easy for the AI. And then the human can come in and do what they're good at, which is looking at the data, doing the critical thinking and problem-solving, and coming up with a final deliverable for that unstructured data.

Stephanie - 00:14:28: 

Makes sense. And contextualization is a big part of what the human can do, too, right? That's not something AI is going to be able to do easily, especially contextualizing for your company and your values and what your goals are, things like that.

Leena - 00:14:44: 

So a great example in gaming is we've got games where there are words that the AI thinks are bad, like "murder," "death," "kill," for example. Right? And in context it's actually a good thing, not a bad thing. It's just a good example of why it's so important to still have the human involved in this process, because humans are complicated, humans are not easy to solve, and so it's still going to take that human judgment when working with the AI to produce the best output and the best deliverables.

Stephanie - 00:15:16: 

Makes a ton of sense.

Brianna - 00:15:18: 

So you've been instrumental in building market research functions from the ground up, notably at EA and uShip. What do you think are the core pillars of establishing a successful, scalable research function in a large, complex organization? And then how do you ensure that it adapts to those rapid technological advancements?

Leena - 00:15:35: 

It's a great question. I have learned, in some cases the hard way, that the key to building a scalable research function really has nothing to do with the research itself and everything to do with understanding the needs of the business and your stakeholders and how it operates. For example, do people want their findings in just a report that you hand out? Do they want to have it presented to them? Do they prefer to use vendors, or do they want to do everything in-house? All of these questions are really how you build an effective and scalable research function. The research itself doesn't really change, but how that function interacts with the business, I think, is the important piece. And especially in large organizations like EA, flexibility, adaptability, and trust are key. I think the adaptability and flexibility parts are pretty self-explanatory, so I'll elaborate a little bit on what I mean by the "trust" piece, which is also essential. Put simply, it doesn't matter how brilliant or amazing the research you do is if the business doesn't trust you. What you deliver will have no impact because they'll read it, look at it, go, "Yeah, okay, whatever," and put it down. That's why I think building that trust in the relationship with the businesses and your stakeholders is a key piece of doing this. And I think that also leads into what you asked about keeping up with the technology. It is not easy. Everything is exploding so fast these days, and it felt overwhelming to me at first. So what I try to do to make myself feel like I'm at least keeping up with the technology is try one new thing. My goal is to try one new thing every quarter. Whether that's a new vendor, new technology, or reaching out to somebody else within the company who's trying a new thing, I try to investigate one new thing every quarter. And because I have a pretty solid understanding of the needs of the business, the challenges that I face, and where technology might be able to help or alleviate some of those challenges, it's relatively easy to do a quick evaluation and decide whether or not it's something to move forward with. So it's not necessarily a huge time sink. It's just making sure I actually do it, because sometimes it'll fall down the to-do list, like, "Oh, I'll just do that later. I'll just do that later," but -

Brianna - 00:18:07: 

Yeah. I love that you schedule that, and you just say, "I'm going to do this," and you create that routine. It helps you stay curious, which I know you mentioned at the end of your YouTube series videos. And curiosity is a big value for us here at AYTM, so I love that.

Leena - 00:18:22: 

Stay curious. To the point where I put it on one of my goals. Of course, we have goals at work, and so it's a little bit of a work goal but also a little bit of a personal goal, making sure that I'm exploring options. There's just so much out there, and it's changing every day. 

Stephanie - 00:18:39: 

For sure. If you don't mind, can we go back to the trust issue again? Because I loved the way you talked about that and I was curious. Maybe just in your experience, I don't want to force you to speak for insights across every organization in the world, but where does the lack of trust come from? If that's the starting point, is it a cultural thing that just has to be overcome within a company? What is the root of that, do you think, between an insights function and decision makers or stakeholders?

Leena - 00:19:12: 

I think the root of it is fear, honestly. If you're my stakeholder and I give you data, you have a choice. You can take my data and my recommendations and trust me, like, "All right. She said this. I trust her. I'm going to go do it." But if you do that and I'm wrong, then it's kind of on your head, because you've made the decisions. You've actioned on x, y, and z. So I think it comes from a fear of making the wrong choice. And some of it is cultural, for sure. But typically, what happens is somebody's been burned at some point, whether the data wasn't accurate or the recommendations weren't accurate, they've usually been burned at some point. And if they have, whether it was by you or by somebody else, getting over that hump and building that trust is going to be even more critical, because they're not going to listen, because they're afraid.

Stephanie - 00:20:19: 

Right. It really calls to mind and brings to the forefront just the importance of being really good stewards of insights.

Leena - 00:20:27: 

Yes. Absolutely. And if there is a lack of trust, having the patience and the empathy to work through it can be really, really frustrating and hard, but it's absolutely worth the effort to bring stakeholders along for the ride.

Stephanie - 00:20:45: 

That makes a lot of sense. One of the things I wanted to ask you about is a specific project, which is rare. I don't get to do this very often. But at EA, you led a global research project with 10,000 responses across seven countries. And, of course, let me know if I got any of that wrong. But my real question is, what were some of the challenges that you faced managing such a large-scale, multi-phase kind of project? And then to follow that, how did you balance the need for a core set of actionable insights with the cultural nuance that surely must have arisen across such a large number of markets?

Leena - 00:21:24: 

That project was a beast, for sure. It was about five years ago, and I wish we'd had then what we have today, because it would have been so much easier. But I think the key to this project was really leaning on my partners, both internal and external, for advice and feedback, because this certainly wasn't anything I was going to be able to sit in a room and do in a silo. There were just so many teams that I had to work with and ask for feedback. It was not going to be perfect; we knew going into it that perfection was never going to happen. We had to do our best to culturalize each market but still maintain the same thread through everything, so that when we got to the reporting it wasn't, "Well, this is true for this country, but this is true for that country," and we couldn't measure them together. It was a huge challenge, but the partners, both internal and external, were critical to that process. As far as the actionability and the cultural relevance, I did the same thing when it came to the analysis and the report, leaning on my partners again, both internal and external. And one thing that I did that I don't normally do when it comes to the readouts and the presentation was that I actually customized the reports to the major teams around the business. So when I did the presentation, I knew what mattered to that team specifically, what was important to them, and I could have those insights already customized to them. Obviously, that was really time-consuming, and it is not something I would recommend or even do myself for every single project, but because this was so huge and so important and so critical to almost the entire business, it made sense, at least at a high level, to have some level of customization for the different teams. Everybody really appreciated the output of the deliverables and the presentation, and it was much more impactful because I was able to say things that mattered specifically to them. So I certainly wouldn't do this for every project, but in this case, what do they say? The juice was worth the squeeze this time around.

Stephanie - 00:23:44: 

It sounds pretty critical to that project's success. That, yeah, makes a lot of sense.

Brianna - 00:23:49: 

So, talking more generally now, though I think that project might be a good example of this: market research is often seen as a function that informs business decisions, but how do you see the role of insights evolving to become more proactive rather than reactive to market shifts, like maybe tailoring those reports to make sure people are getting answers to what matters to them?

Leena - 00:24:10: 

As insights professionals, I know we don't like to guess. We want data to prove or disprove any hypothesis that we have. One of the challenges I know I face is that, historically, research moves too slowly. The business is asking questions that require almost immediate answers, not, "Okay, I'm going to execute a research project and I'll get back to you in four months with the answer." That's never going to work. And that was actually a big problem we had with the 10,000-respondent project we just talked about: it was so big and took so long that in some cases, teams were like, "Well, it's too late now. We've moved on with our lives." That's another reason why the customization was necessary; I had to go back and figure out what the new objectives were and what the current challenges were, and adjust to that a little bit. But as far as how it's changing, in many cases I think we're moving so fast that it's no longer about "what do customers want today" but "what do customers want tomorrow," because that's just how quickly things are moving. So we're going to have to start figuring out ways to be more predictive and proactive rather than just reactive. Until somebody invents an accurate crystal ball, because my Magic 8 Ball certainly isn't very accurate at this point, I think as insights professionals we're going to be critical to helping businesses make these more proactive, not just reactive, decisions. And to play a more proactive role, we have to be strategic partners with the business, so I'm always pushing my stakeholders to treat us, the consumer insights team, as partners in the process and not just somebody who hops in and out of the process as needed. We really need a seat at that proverbial table to guide the company, to make more proactive decisions, and to create that proactive capacity. And I think trust plays a big role here, too; it comes back into play.

Brianna - 00:26:12: 

Now in my mind, I'm just thinking about the Magic 8 Ball. I just need more experience. I just need to level up, and then it will turn into that beautiful crystal ball that's going to give me all the magical answers.

Leena - 00:26:21: 

That's right. Yeah. I'm still working on that. Until technology makes the Magic 8 Ball accurate, we're going to have to use data and insights.

Stephanie - 00:26:29: 

We're going to get there. I can feel it. Something you said reminded me of a guest we had on recently. I don't think that podcast episode has been released yet, but he had just said something that applied to insights, and I asked him, "How do you think this applies to foresight, not just research?" And he was like, "Insights is foresight." It's like something you just said: "It's not about today. It's about the future." That's where we are now, right? It's just a reality.

Leena - 00:26:56: 

Yeah. It is a reality. It's all about tomorrow because today is too late.

Stephanie - 00:27:02: 

It's too late. Yep. 

Brianna - 00:27:03: 

All right. So one of your many accomplishments is optimizing survey systems to improve both the data collection and the reporting processes. In your experience, what is something that survey systems typically get wrong or miss?

Leena - 00:27:16: 

As far as systems, I think stakeholders especially tend to want to put everything but the kitchen sink in a survey; that's their knee-jerk reaction. I'm sure my stakeholders are sick of me saying it, because I'm always telling them, "You can't boil the ocean with a single survey." One of the hardest things with what I did at EA with that process was getting people to land on, "What are you trying to accomplish with this? What is going to bring you actionable data? If we had a player sitting in front of you, what would you ask them to make a decision about, whatever decision it is you're trying to make?" Getting them to be razor sharp with those objectives and what they really needed to know was incredibly hard. I think a lot of times misses happen because people are not putting the time and effort, and it is a lot of effort, into that upfront work of defining the questions, the objectives, and the metrics you need to make decisions. That is typically the biggest gap, and also, if you don't know what your objectives are and what you're trying to solve for, then you can't action on it. That first step, I think, is the biggest gap and the biggest challenge.

Stephanie - 00:28:37: 

It becomes your map across the entire project. Right? And if you don't have that anchor -

Leena - 00:28:43: 

Because a lot of times they'll say, "Well, I just want to know." That's good. I mean, I understand, and you want them to want to understand more, but do we need to take up the player's time, and time is money, asking them things you just want to know versus things that give you a data point with which you can actually make a decision? So it's hard. It sounds easy, but it's hard. It's important to put that effort into the upfront work.

Stephanie - 00:29:11: 

For sure. And I think Brianna and I can probably speak to that challenge playing out with our client base, because sometimes our insights friends are not successful at getting that distilled with their stakeholders, and a lot of times we're having that conversation at the supplier level once we get the brief: "What are we trying to do here at the core, at the end of the day? What are three things you would be able to action on coming out of this?" And then trying to continue that conversation in the same way and maintain it throughout the project, too. Another stage where we tend to run into a conversation that needs to be had is, "Okay, we've aligned on the core objectives in the brief. We'll work on that survey design and review it." Okay, well, now they want to send it to their stakeholders, and each stakeholder is like, "Can I put in this question? I have two more questions." Right? And it's like, "No, no, no. We need to get back to: what's the core? What's our North Star here?"

Leena - 00:30:05: 

Yeah, that North Star. But, honestly, I think that's why we have jobs, because they're not researchers. This is hard. It is hard work, and it is our job. It is my job to help them through that process and get to a point where we come up with effective, actionable objectives, because it is hard.

Stephanie - 00:30:28: 

And we feel the same way. It's good for everybody to have that understanding and to be on the same page. Right? Otherwise, you're just not going to have a successful project. So.

Leena - 00:30:37: 

Exactly.

Stephanie - 00:30:39: 

Well, Leena, this conversation has been absolutely illuminating. What a fun and cool industry you're working in, and it sounds like you're doing so many cool things. There is a question that we like to ask virtually everyone who comes on the podcast: what is the piece of advice that you would offer to somebody who's just starting out in the world of insights?

Leena - 00:31:02: 

My recommendation for anybody is to listen and learn from anybody and everybody you can, but don't be afraid to try something new. Experiment, test, and learn. I'll share one of my favorite quotes from the puzzles and mysteries talk that I did; it's from Doug Altman, who was a statistician. He said, "Everyone is so busy doing research that they don't have time to stop and think about the way they're doing it." So my advice is: don't be "everyone." Be somebody who does think about the research, who does think about, "Is there a better and different way to do it?"

Stephanie - 00:31:40: 

I love that. Yeah. Highly applicable. Great advice for anybody starting out. So, Leena, thank you so much. This has been a pleasure for both of us, I'm sure. So thanks again. 

Leena - 00:31:51: 

Thank you.

Brianna - 00:31:52: 

Yeah, it was so exciting to get to talk to you. I know we mentioned this off camera, but I'm a big gamer myself. I've put a lot of hours into Apex Legends and cried many beautiful tears in It Takes Two, which is a lovely game. So, yeah, it was really great to talk to you today, Leena.

Leena - 00:32:08: 

Well, thank you. I appreciate the time, and I appreciate you having me on.

Stephanie - 00:32:13: 

The Curiosity Current is brought to you by AYTM. To find out how AYTM helps brands connect with consumers and bring insights to life, visit aytm.com. And to make sure you never miss an episode, subscribe to The Curiosity Current on Apple, Spotify, YouTube, or wherever you get your podcasts. Thanks for joining us. We'll see you next time.
