Scaling Consumer Insights: Design Thinking, Empathy, Tech & Analytics in Action

Description

In this episode of The Curiosity Current, host Stephanie talks with Lev Mazin, CEO and Co-Founder of aytm, about reimagining how consumer insights are generated and used. Beginning with aytm’s origins in the lean startup movement, Lev explains how design thinking shaped the platform’s evolution from a scrappy prototype to a trusted global solution. He outlines how integrated panels solved the disconnect between survey tools and respondents, and how automation of advanced analytics such as perceptual mapping and MaxDiff made once-exclusive methods accessible to a wider audience. The conversation also highlights aytm’s respondent-first philosophy through PaidViewpoint, where incentives, fairness, and respect became the foundation for data quality. Lev explores the rise of synthetic data, the challenge of distinguishing signal from noise, and why dormant corporate data represents the next frontier for insights. Grounded in curiosity, adaptability, and respect for people, the discussion offers a forward-looking view of how technology and human-centered design will shape the future of research.

Lev - 00:00:01: 

You have to get into the mindset of a typical user in order to be empathetic and to produce something that would connect and be helpful to them. But it's such a hard thing when you have to simultaneously keep in mind personas that are so different from each other in their way of understanding that terminology, and in the way they are predisposed and interested in using different things. And that puts a sharper focus on what kind of platform, what kind of technology company we are trying to be. On one hand...

Stephanie - 00:00:36: 

Welcome to the Curiosity Current, the podcast where we dive deep into what's shaping today's trends and tomorrow's consumers. I'm your host, Stephanie, and I'm so glad you're joining me. Each episode, we tap into the minds of researchers, innovators, and insights professionals to explore how curiosity drives discovery and how discovery drives better decisions in an ever-changing market landscape. Whether you're a data enthusiast, a strategy pro, or, like me, just endlessly fascinated by human behavior, this is the place for you. So get ready to challenge your assumptions, spark some fresh thinking, and have some fun along the way. Let's see where curiosity takes us next with this brand new episode. Welcome to the Curiosity Current. Today's guest is someone I'm really excited to chat with and someone I know pretty well. Today, we're joined by Lev Mazin, the CEO and co-founder of AYTM. And full disclosure, he is also my boss. For anyone who's new here, AYTM is the insights technology company that powers this very podcast, and so we're so happy that Lev could come on for a discussion. Lev has been on this journey for, I think, what, a couple of decades, Lev, helping to build the platform from the ground up with a pretty singular vision: to create a world where curious people can directly engage with technology to find the answers that they're looking for. He has helped steer the ship from a scrappy startup to a company that now connects hundreds of brands with millions of people around the globe, always with an eye on what's next in research, AI, and making insights more accessible. Lev, I am so excited to have you on the show.

Lev - 00:02:22:

Thank you, Stephanie. I'm excited to be here. I'm such a fan of the podcast that you and Matt started, and I'm listening to every episode. I really appreciate all the thoroughness, thinking, guests, and the dynamics that are happening here. Thanks for having me.

Stephanie - 00:02:42:

Absolutely. It's gonna be a fun one. Well, to jump right in, something that I wanted to start with: we talk a lot on the Curiosity Current about how a lot of us in the insights industry kind of found our way to these careers unintentionally or serendipitously. And I know that that's true for you, too. So before we get into your founder story, which I'm super excited to talk about, I would love to know what were you doing before you founded AYTM?

Lev - 00:03:11: 

I was painting and drawing and creating logos and websites, and my wife and I were doing graphic design mainly as a design studio for many years.

Stephanie - 00:03:26: 

Very cool. So then, to really kick things off, can you tell us the origin story of AYTM?

Lev - 00:03:46: 

Of course. Yeah. That's the origin story, and I will try to compact many, many years into just a few phrases, so bear with me. It starts with adjustable-height high-heeled shoes, believe it or not. During the time when my wife and I were helping companies with their corporate identities and websites and whatever else they might need from a couple of designers, one of the websites that we did won a random contest. We didn't even think it was that important to apply, but applying and winning that contest turned out to be a life-changing decision, because the results were noticed by my future partner, David Handel, who called me and asked if I would be open to creating a corporate identity, logo, packaging, advertising, and marketing materials for something that he invented. I was super intrigued and extremely interested in helping, and quickly started this project. And I learned that David had, for many years, applied empathy, creativity, and ingenuity in just observing life. One day, many years before that conversation, he was riding a bus in Manhattan, looking at office workers running in sneakers and then changing into high-heeled shoes as they entered the buildings. And he thought, "That can't be very convenient; that could be optimized and improved." So he came up with and patented a mechanism that elegantly transforms the same pair of shoes from low-heel to high-heel. He was about to launch the company when he found me and offered me the chance to participate and create that logo and the whole nine yards of materials that were needed. That's how it started.
Next thing I know, after a few drafts and a good beginning to the project, my wife and I, who at that time had our first interstate-capable vehicle and our newborn kid, all got into our Highlander and drove to Henderson, where David and April, his wife, met us for the first time. We became quick, close friends and continued working on this project. Several years later, after this shoe idea became popular on all the morning TV shows around the country, David arrived at the sense that he would like to work on more software-driven startups because they're easier to scale. And that's where we started prototyping. The prototype phase was very exciting, exhilarating, and it fell right into the MVP lean startup movement that was very prominent and famous at the time. Eric Ries and Steve Blank were the fathers of that movement, and they were everywhere in Silicon Valley and the other clusters of startup activity. David and I started implementing ideas that he was mostly authoring, and it took us about three months or so to go from an idea to an MVP. We launched it. We had our first prototypes. We had our first users who came and tried a very wide spectrum of applications. Right? From applications for kids to do their chores to applications for Alzheimer's patients who were struggling to remember their grandkids. Every time, we learned something, and we were really excited about that thing coming to life, but we didn't have people to actually use it and stay long enough to tell us what we did wrong, why there was no product-market fit. Which led us to the understanding that we didn't quite know if what we were conceiving was going to work or not. We were taking a risk; we knew that the majority of startups would fail. That's a given.
But we knew one thing for sure: there were at least two guys in the world who could use something like market research, and we could use it at a lower price range than what it went for, and at a fraction of the time it usually took. And classical market research at the time took about three months. If you had questions and went to a full-service bureau, they had to connect you separately to a panel company and separately to a survey platform. And after three months and about $30,000, you would get your report of all the things that you got wrong. But that was time in which, otherwise, you would, you know, build the next, uh...

Stephanie - 00:09:13: 

Prototype. Right.

Lev - 00:09:15: 

So that led us to an idea: "Let's try the next thing. Let's try building something that, at least, folks like us, entrepreneurs, could use to ask questions outside of their social bubble." Right? Because we all have a tendency to ask if our fabulous idea is the best thing since sliced bread, and of course our friends and family will support us and say that, yes, it's wonderful. Our first value proposition was that for $29.95, it might be better to run a short survey than to spend that money on Starbucks while asking your friends what they think.

Stephanie - 00:09:56: 

I love that you built something essentially for yourself and for people like you. I have to think that made the design process a little more intuitive. Right? It doesn't mean you didn't need insights, but you had yourself as an anchor to think about, "What is my pain point? What am I experiencing?" And then you're out there solving it. That's pretty fascinating. One of the things I wanted to ask you about, and you mentioned it briefly, is the integrated panel and how difficult things were at that time. I was also doing research then, on the supplier side at a small consultancy, and I remember the laboriousness of having a platform over here and then, when it was time to field, going over to another company and connecting those two things. So when I think about what made AYTM so unique, the real differentiators over the years and especially in the early years, one is that integrated panel idea, which was a real departure from how most of us were using platforms and panels. And then another thing I was hoping you could talk about is that early focus on automating more advanced analytics. I think the reason I'm interested in it is that it's not like you came from a heavy research background where you thought, "Oh, I'm doing these in a really hard, expensive way with a full-service firm," but you still had the presence of mind and the intelligence to know that automating these advanced analytics for some types of users was going to be a really big deal, because that was something you couldn't get in very many places at the time. What was the impetus for focusing on those two areas, the integrated panel and the automation of advanced analytics?

Lev - 00:11:49: 

Sure. Those came at very different times in our evolution. The panel came first because, by the time we came up with the idea, the SurveyMonkeys of the world were in existence and already quite popular, trending, actually. So creating just a survey tool wasn't adding anything to the equation. We never thought of ourselves as a competitor to SurveyMonkey because they provided great survey technology, survey programming and fielding technology. Doing a panel business wasn't interesting to us because it was outside of our radar. We didn't even quite know the term "panel," but we knew that it's a two-sided business model. We have people with questions on one side, and we know there are a lot of people with answers walking the other side of the street, and we needed to connect them. But how do you connect with someone you don't have within your zone of access? We had to solve the chicken-and-egg problem. We had to start by building a community, and this is, you know, the 2008 real estate crisis, the recession, a scary time. People were interested in sharing their opinions and learning the opinions of others. And right about that time, we decided that, yeah, we had nothing to lose. Let's take this scenic route and see what we can build if we start from scratch. And mind you, none of us had experience building large communities. Right? I knew how to build websites, but I never thought about getting to, you know, millions and millions of users on a single website. So we considered, out of the gate, using Mechanical Turk as a panel source. We even had a conversation with them, and we learned that quality wasn't the focal point for them; it was the speed of simple, short tasks. And the majority of the workers in their crowdsourcing solution were from India, and that's not a typical target audience for entrepreneurs thinking about a startup in the U.S., which was our first stop. So we thought, "Okay.
For this one, it's not enough just to do the application, just to do the software. We need to first create a micro-prototype community that will produce enough active, opted-in participants willing to answer questions, so that by the time we launch, and we launched on 10/20/2009, we would already have someone to answer the questions when our first customers posted them." And that's what happened. We had to build that thing from both sides, and the puzzle clicked into place when we found ourselves at TechCrunch 50 and launched the company.

Stephanie - 00:14:59: 

That's really cool. And then do you wanna talk a little bit about the advanced analytics piece, too?

Lev - 00:15:05: 

Sure. Sure. To be honest, I had no concept of MaxDiff or conjoint; all of those terms were completely new to me when I first learned about them. And that wasn't in that first phase. That was "unlocked" as an achievement in a game. Yeah. Our second phase of self-realization: we won a few competitions within the startup ecosystem. Right? We went on stage at Moscone and presented what we'd done, and we got picked up by the press somewhat. And at some point, we decided to build our second panel community site, which inherited the energy and user base from the first one. We called it PaidViewpoint. It's still with us, growing and providing an amazing experience to both sides of this business model. But we launched it at South by Southwest in Austin, Texas.

Stephanie - 00:16:06: 

Oh, that's cool.

Lev - 00:16:07: 

That was the time when we got discovered by creative agencies. Creative agencies are very good at making things pretty, and they're competing with each other, trying to make the case to the buyers of their services that they're in the best place to advise them on this design, on this ad, on this package, on this website. And they used consumer insights, market research, as a tool to differentiate, to come across smarter, to be more relevant to their clients' users. They discovered us and said, "Oh my God, this is so wonderful. This is not a solution for startups; this is a solution for creative agencies like us. Let's get in business. Let's do it." And they started using us. By that time, we already had several question types, and we were working on several more; sliders became a thing. We launched with only multiple choice and single choice. Even open ends were added a little bit later.

Stephanie - 00:17:10: 

I love it.

Lev - 00:17:11: 

And we added video responses. That was a moment when we saw that, "Yes, these are real people with real rooms behind them, with real cats walking around and dogs and whatnot. Kids screaming." That was the moment when I felt that, yes, the panel is alive. These are real people, and we're onto something. And then creative agencies started asking us questions that we didn't have answers for. They're like, "Oh, there's this thing called perceptual mapping. It's so cool and so smart, and I like presenting it in my deck. I learned it from a consumer insights partner of ours who's too expensive for us to use regularly. Can you please incorporate it into your platform?" And we're like, "Okay." We went back to the textbooks, learned what perceptual mapping and multidimensional scaling are, and reinvented them using all the technology and tools available to us at the moment, taking them beyond what was then the status-quo expectation for data visualization, ease of use, and ease of assembly. And step by step, that continued, making our platform more robust and more sophisticated, up until the moment when we heard the term "MaxDiff." It was surrounded by this aura of inaccessibility and extreme complexity, the sense that it took a small army of very smart people to pull off. And I remember the conference, IIeX in Atlanta, Georgia, where we presented drag-and-droppable MaxDiff that created quite a splash in the industry. And some people were...

Stephanie - 00:19:01: 

Game changer.

Lev - 00:19:02: 

...were really upset with us, you know, for breaking that important bastion of full service. Some people were super excited and intrigued, and took us up on the "quality ride," checking whether or not it matched the methodology they had been doing manually for decades. It almost matched, but we had used a faster methodology and statistical model; then we upgraded to hierarchical Bayesian estimation, and it became an ideal match. It started saving a lot of time and building momentum, and, basically, MaxDiff was liberated for a lot of people who couldn't use it before.
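To make the "faster methodology versus hierarchical Bayesian" distinction concrete: the simplest way to score MaxDiff data is plain count analysis, where each item is scored by how often respondents picked it as "best" versus "worst" across the tasks where it appeared. The sketch below is an illustrative Python implementation of that count approach only; the function name and task format are invented for this example, and aytm's actual pipeline (and any hierarchical Bayesian estimator) is far more sophisticated.

```python
from collections import defaultdict

def maxdiff_count_scores(tasks):
    """Score items from MaxDiff tasks using simple count analysis.

    Each task is a dict: {"shown": [items], "best": item, "worst": item}.
    Returns item -> (best_count - worst_count) / times_shown,
    a value in [-1, 1] where higher means more preferred.
    """
    best = defaultdict(int)
    worst = defaultdict(int)
    shown = defaultdict(int)
    for t in tasks:
        for item in t["shown"]:
            shown[item] += 1
        best[t["best"]] += 1
        worst[t["worst"]] += 1
    return {i: (best[i] - worst[i]) / shown[i] for i in shown}

# Three hypothetical tasks from one respondent:
tasks = [
    {"shown": ["A", "B", "C"], "best": "A", "worst": "C"},
    {"shown": ["A", "B", "D"], "best": "A", "worst": "D"},
    {"shown": ["B", "C", "D"], "best": "B", "worst": "C"},
]
scores = maxdiff_count_scores(tasks)
# "A" scores 1.0 (always best when shown); "C" scores -1.0 (always worst).
```

Count analysis is fast but only yields aggregate rankings; hierarchical Bayesian estimation instead recovers respondent-level utilities, which is why the match to full-service results improved after that upgrade.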

Stephanie - 00:19:46: 

So accessible. I mean, so many of our surveys now, whether they're designed by a startup user or an experienced researcher, include MaxDiff. What a lovely tool that really was gatekept in a lot of ways for a long time, strictly by virtue of being rather difficult to implement. So, yeah, big game changer.

Lev - 00:20:09: 

Mhmm.

Stephanie - 00:20:09: 

And it's interesting to hear you talk about those early years, because one thing I'm really hearing is the flexibility that you and David must have had in your thinking about what you wanted AYTM to be. You certainly had an idea of its value, but you let the market lead you a little bit. When agencies showed up and said, "This is a tool for us," you said, "Okay. You know what? You're right. It is a tool for you." And when they said, "We would love it if it could do this," you took that challenge, and, I wouldn't call it a pivot necessarily, but you really took that information in to continue to grow with the market and adapt. Is that something you've carried with you over the years, the spirit of that?

Lev - 00:20:54: 

Oh, certainly. When you are coming from a different industry, you don't know what the need is, you don't know the value of what you're building, and you're in a constant listening mode, a learning mode. So I like this visual. I'm a visual person. I like the analogy of us going up the stream in this ecosystem as a wild salmon does. From startups, we jumped over to the creative agencies phase, and then the next bit of magic happened when they brought us to their clients, because they were like, "Oh, we don't even know how to present this beautiful data visualization and this survey result. We want you on the call." And we found ourselves in an awkward position, consulting professional market researchers on something that we invented. We built it for them, and we knew the typical pitfalls and rookie mistakes, but we felt out of our depth in many of those conversations, to be honest. That's how we actually met Troubadour at the time, and felt that we needed to augment our knowledge and skills with someone who was actually trained in this discipline and could fortify our offering. And that was the moment when we started talking to end users in consumer insights and CPG companies. Serendipitously, at the time, they were on a converging trajectory to do more themselves, to get hands back on the keyboard, and to regain independence from the full-service agencies, to accelerate their time to insights and more successfully compete with faster-moving startups infringing on their huge empire territories.

Stephanie - 00:22:51: 

So then, switching gears a little bit, I wanted to talk about your background as a designer. You mentioned that's where you got your start, and I'd like to talk about your approach to product design at AYTM. In particular, I'm interested in how you think about the balance between robust functionality, which we were just discussing with the automation of these more complex tools, and ease of use. As we were just saying, our users really do range from startup founders and brand line managers who are not researchers by training, all the way to trained researchers who sit in large enterprise organizations. That's the reality of today. How do you approach design in a way that addresses needs at both ends of the spectrum?

Lev - 00:23:41: 

It's an excellent question, and sort of the theme of our product dilemmas and solutioning for all these years. You're constantly considering the cost of adding new functionality. Right? You're constantly hearing, "Let's add this feature, and please add this feature and that feature." And you have to get into the mindset of a typical user in order to be empathetic and to produce something that would connect with and be helpful to them. But it's such a hard thing when you have to simultaneously keep in mind personas that are so different from each other in their understanding of the terminology and in the way they are predisposed and interested in using different things. And that put a sharper focus on what kind of platform, what kind of technology company, we were trying to be. On one side of the spectrum, we had SurveyMonkey as a preexisting benchmark, something optimized for soccer moms, for anyone who has a need for a survey. There's elegance in it, beauty and simplicity and clarity. On the other side of the spectrum, I always had a lot of respect for Sawtooth, which is sort of the Photoshop of survey creation. Right? You can do anything and everything your heart desires. You can break it easily, but you can also, you know, build masterpieces if you know what you're doing. But so much depended on you knowing what you're doing, on spending the time learning the methodology and basically being a market researcher. It's a specialized tool with a lot of flexibility. We were looking for a path between those extremes that would allow us to get further than what was thought possible in a DIY platform, such as adding sophisticated research tests to your study. And every time we added something like that, we thought empathetically about the user who may find it useful but doesn't want that functionality to be a weight on their psyche or on their perception of the platform's ease of use.
And it means that you can't showcase everything, every available feature. You have to pick and choose, and sometimes you have to let features sit quietly, waiting for someone who knows when and how to use them.

Stephanie - 00:26:22: 

Got it. Okay. So I wanna turn the conversation a bit and talk about data quality. We both know that data quality, particularly in the context of panel surveys, is a huge topic in our field right now. There's a bit of a crisis in the confidence that brands have that their surveys are being answered by real humans who are who they say they are and are thoughtfully engaged in answering honestly. Given that we are a panel provider, AYTM has always placed a lot of emphasis on the people behind the data, treating respondents not as a commodity but as valued partners. By the way, when I came on board at AYTM, this was astonishing to me. I've worked on the supplier side, I've worked on the brand side, but I'd never worked with a panel directly. The thoughtfulness and care around the respondent experience was really new to me. All it took was for you to point out how important that is for me to think, "Oh my gosh, of course. These are not data points in a file. They're real people, right, sharing these opinions." Why is that so important to you? Or how did that become so important to you? And how has that informed how you've built and maintained PaidViewpoint over the years?

Lev - 00:27:40: 

Sure. I love talking about it because it's a prerequisite for everything else to work. No matter what MaxDiffs you have, if they're not matched by quality, thoughtful answers, there's only so much you can do. So again, rolling back to the early days, we had no idea how to build it, so we didn't inherit the issues and bad practices that were the default choices. I also think there is a subtle difference between being a panel provider and a panel company. We do provide access to opinions, to respondents, but we don't consider ourselves a panel company, never did, and probably never will. Because a panel company treats respondents as a commodity that they paid for and need to squeeze ROI from. Right? They need to turn them around before they churn and get as much value as possible in the process. Interestingly enough, some panel companies decided to diversify and show respondents ads to extract value that way. Some companies were more conservative and said, "No, it should be separated. Advertisement and listening should not be mixed," and that was the mantra I learned early on when I started researching the space. But we started by doing our own desk research. We learned what respondents most frequently complained about in their experience with panel companies. And we saw several things. First, they were lied to in terms of incentives. Very often, they were disappointed that, you know, they spent thirty minutes, forty-five minutes, and came out of that experience without anything. They were promised a lot and came out with nothing. That is demeaning and just upsetting for anyone, because the most precious commodity for any human being is their time, and they were spending those precious minutes, and sometimes hours, in exchange for a black screen and not even a "thank you." So that was the first thing we heard.
That included being screened out after a long time in the survey, right, without getting anything. The second thing we learned about was too-long, convoluted, boring surveys. And the third thing we heard was concerns about privacy: that they would be parting with their information and then being sold to by companies asking to sell, not asking to learn. So we incorporated that knowledge into the brand promises of PaidViewpoint, and we said, "Okay, we will structure it differently. We will pay for every question that is asked." And we illustrated it at South by Southwest by bringing a huge glass jar of coins. When people stopped by the booth, we were like, "Oh, hello. What's your name?" And they were like, "Oh, I'm John. Nice to meet you." And I would give them twenty-five cents. They're like, "Why? What for?" We said, "PaidViewpoint pays for every answer that you give." And we did, and we still do. If we ask you a question and find that you're not qualified for this particular survey, that's on us. You did reply. You did come. You did stop your life for a few minutes, and you paid attention. I think it's the least that panel companies can do in order to maintain goodwill and a good ecosystem. Second, we restricted open-end character limits to below Twitter's limit when we started. We imposed pretty strict limitations on our users in order to protect the experience of respondents, and we pay a lot of attention to that experience. The fact that every survey must be programmed on AYTM in order to be taken by our respondents is an important factor, because we put extra emphasis on making sure it's not going to be broken on a mobile device, that it's going to work well on Android and Apple devices, on tablets, on smartphones, on desktops of different sizes.
It's not an easy thing to do, but if you put a lot of emphasis on it, you get much better results and a stable experience, and it becomes almost invisible; all the respondents' attention goes to the targeted, focused questions that our users are asking. The last thing: we promised that we will respect their privacy, that their information will be used only in these particular scenarios, and that we're not going to sell what we know about them outside of surveys. Those few simple pieces of trust building and interaction created a completely different dynamic. Right? We show respondents that we're listening by demonstrating that we know whether they're being consistent or inconsistent in their answers. We created a system called "TrustScore," and it keeps rising as we see that respondents are paying attention and answering thoughtfully. And if at some point they don't, we have a conversation with them, as you would with people. That, along with other things we've implemented over the years, created relationships between us as a brand and our respondents. That translated into engagement, into their willingness to invite their friends and family into this experience. And in the end, it translates into the great quality of insights that our clients and customers benefit from.
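The TrustScore mechanic Lev describes, a score that rises with consistent, thoughtful answering and triggers a human conversation rather than silent removal, can be sketched as a toy model. The class below is purely illustrative; the real PaidViewpoint TrustScore algorithm is proprietary, and the starting value, increments, and threshold here are all invented for the example.

```python
class TrustScore:
    """Toy model of a respondent trust score (illustrative only).

    Trust is slow to earn and quick to question: the score rises a little
    with each consistent answer and falls sharply on an inconsistency.
    """

    def __init__(self, score=50.0, gain=2.0, penalty=10.0):
        self.score = score      # starts at a neutral midpoint
        self.gain = gain        # small reward for consistency
        self.penalty = penalty  # larger cost for inconsistency

    def record_answer(self, consistent: bool) -> float:
        if consistent:
            self.score = min(100.0, self.score + self.gain)
        else:
            self.score = max(0.0, self.score - self.penalty)
        return self.score

    def needs_conversation(self) -> bool:
        # Below this threshold, "have a conversation with them as you
        # would with people" instead of silently discarding their answers.
        return self.score < 40.0

ts = TrustScore()
for _ in range(5):
    ts.record_answer(True)   # five consistent answers: 50 -> 60
ts.record_answer(False)      # one inconsistency drops it back to 50
```

The asymmetry between `gain` and `penalty` is the design point: a respondent builds standing gradually, while a single inconsistency is noticed immediately but does not, on its own, push a good respondent below the conversation threshold.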

Stephanie - 00:34:05: 

For sure. And I think that's what it comes down to in a lot of ways. It sounds like you really focused on the notion that the researcher and the respondent are equally important parts of this ecosystem of research. The more the researcher respects the respondent, and by that I mean respects their time, respects them by not launching a forty-five-minute survey with really, really long descriptions, the more the respondent is going to respect that experience and give you the data that's going to help you make decisions. And it becomes this self-fulfilling cycle in a really positive way.

Lev - 00:34:52: 

For sure.

Stephanie - 00:34:53: 

Yeah. Very cool. I love it. I wanna talk about a different aspect of data quality. You have this quote that I really like, and it relates in some ways to what we were just talking about. You've written that all of these data quality issues we're wringing our hands about right now, wondering if this is an existential crisis, shouldn't be viewed strictly or merely as noise that we need to filter out, but as signals for us to explore. And I wondered if you could share how that mindset of curiosity, approaching it as a signal, feeds into our holistic approach to data quality at AYTM.

Lev - 00:35:36: 

I think it's part of our profession and our responsibility to be curious and to say, "Yes. Bring me more data, not less data." When something happens that we attribute to bad quality, the knee-jerk reaction for many consumer insights folks is, "Oh, let's burn it. Let's remove it. It's contaminating our data." Right? I'll give you a simple example. Some time ago, when LLMs and GenAI became so accessible that people started seeing them in survey responses, I started hearing in the industry, "Let's do something about it, but we don't want to think very long about it. Let's just block the copy-and-paste gesture in surveys, because it's going to stop people from copying from ChatGPT, and it's going to free us from the burden of dealing with it." And my immediate response was, "Let's wait a second. Let's think about it. What are we actually going to lose if we decide to block that event?" Technically, it's very easy, and I think many survey platforms decided, "Yeah, why not? Let's just stop people from copying and pasting text into open ends." You probably will end up with fewer of those LLM-generated verbatims as a result, but you are also going to make things harder for honest, thoughtful respondents: people who copied their answer into a Word doc to check the grammar, who took extra care to see if what they're saying is well phrased in a language that might not be their primary language. And you don't want to ruin the experience for those who care the most. You want to know if someone copied and pasted, and it might be totally fine. It might not be fine if it is triggered along with other signals that point you in the direction that this is probably a combination of events and digital body language typical of someone who is not a good respondent.
So I am a proponent of knowing more rather than less, and of building smart models that can triangulate and react to the sum of all those factors, rather than trying to solve this incredibly difficult equation, which is only going to become more and more multifaceted, with blunt instruments and simple solutions. In my experience, that is just not something that will do us good.

Stephanie - 00:38:31: 

I couldn't agree more. And I love the copy-and-paste example. I have another one that I'll often talk to our clients about, and it rests on the same idea: you really want to triangulate on a set of signals that let you know this is a bad respondent, or a bad response set, rather than relying on a single metric. The example I often talk about is cleaning people out for making one mistake in a QC question. And I always say to people, "Do you really want to be left with only the most conscientious people in your target market?" Because I promise you, your target market is full of people with busy lives, with multiple kids, who are running between appointments, who have ADHD. Right? These are real people, and definitely, we want our data to be clean. We don't want to be talking to bots. We don't want to be talking to people who are checked out. But there's messiness in life, and we need to allow for a little messiness and still know that a good, honest person can give you fabulous feedback and still make a mistake in a survey, and that can be okay.

Lev - 00:39:45: 

Absolutely. I couldn't agree more.

Stephanie - 00:39:48: 

Well, I know where I was going next. So on the flip side of this panelist data quality coin, we also have the rise of synthetic data. And here again, there's a growing conversation around its validity in the insights industry. I would love it if you could share your thoughts on synthetic data. Are we phasing out humans entirely, or should synthetic data be something we think of as a complement to real-world data?

Lev - 00:40:16: 

Okay. So the jury is still out. If a company, a supplier, or a consumer insights team inside a large organization dismisses it outright just because it's synthetic and they're interested in organic human information, that's fine, but I don't think it's the wisest thing to do at this moment. We're actively researching and constantly monitoring the latest technologies available, running a number of experiments, and looking for repeatability and depth in what this technology can offer. I think there are many things people mean when they say "synthetic data," and it needs to be defined very carefully and analyzed as separate strata of that term. On one hand, I can clearly picture a version of the future, maybe not that far away, when our Siris will become so much smarter than they are today. They will not only know who we are better than we could articulate ourselves; they will predict what will happen to us in the near future better than we could, because they will be influencing us. I'll give you an example. Imagine that your smartphone is so smart, your Siri is in constant conversation with you, your car is engaging in philosophical discussions as you commute to work, and your home is ordering stuff that it knows you will love. It knows the fragrance of your shampoo, sees something new on the market, orders it, and you provide your feedback in the shower the next day. Right? But those processes will be automated. I think there is a world in which I'm not going to be useful as a human answering a survey, because I don't know the likelihood of me changing my car within the next two or three years. It's such an abstraction. I don't think about it. I don't know.
But my Siri might know very well that three months from now, it's likely going to recommend that I trade in my current car because its value is about to go down, and there's a new model on the market that it knows I will appreciate. And in that world, interviewing our assistants might become a flavor of synthetic data that is not quite available as a technology yet.

Stephanie - 00:43:05: 

Yeah.

Lev - 00:43:06: 

Right? But that's an interesting world, because I will continuously generate so many events through all the sensors I'm wearing and all the conversations I'm having with those assistants in my life. It will be fortified by real telemetry, real GPS data, real conversations, thoughts, purchases. I will be producing this cloud of real data that will be more reliable than today's survey answers, which humans, with our imperfect memory, social desirability bias, lack of attention, and lack of time, supply in an imperfect way. If we get access to all that factual information through assistants, that may be the holy grail: a future combination of real data with assistants acting as synthetic interpreters of it. And I can picture a world in which respondents will allow limited access to their assistants in exchange for some monetization, some incentives, some discounts. I don't know what else. Social credits.

Stephanie - 00:44:20: 

Bitcoin? Yeah. Yeah. For sure. And it's such a good point, because it's always been the case that when we ask about future intent, especially on a timeline, especially for products that are more expensive or purchased infrequently, we know that surveys are imperfect tools to answer those questions, but often they're the best tool we have in our toolkit. So the notion that there is a better way to do this is really an exciting future reality more than anything. And, yeah, there are still things it's going to be nice to have humans tell you. Like, "What do you think of this?" Right? Or, "What's your attitude around this?" These very immediate feelings and perceptions. But you're right. When it comes to future behavior, and even remembering past behavior, that is not a strength of humans to a large degree. So I love that. Yeah.

Lev - 00:45:12: 

As the starkest example, it's much healthier to look at my odometer reading to answer "What's my mileage, and how many miles have I driven this year?" than to ask a respondent, because none of us has a clear memory of that unless you're into tracking every mile on your odometer.

Stephanie - 00:45:36: 

Yep. Makes a lot of sense. Well, Lev, we've come to the end of our main set of questions. It's been amazing to talk with you today. I get to talk to you all the time, but we don't get to sit down and have conversations like this very often, so it's been a lot of fun. Before we close out, there are a couple of questions that we always like to ask all of our guests, and you are no exception. The first one is: what is one piece of advice you would offer to someone who's just starting out in consumer insights?

Lev - 00:46:09: 

I would say that it's an amazing time to enter our industry, for several reasons. If you're coming from a technology background, there's never been a moment when so many amazing out-of-the-box solutions were available. You just need to combine them and make them sing. If you're coming with the intention of joining a corporate consumer insights team, I think you could be the bringer of a new mentality, new speed, new quality, new thinking. And your best friend is curiosity: trying every tool you come across, taking it for a ride, making sure it does what it promises, embracing and using all the latest models. Make LLMs your friend. Know their limitations. Know when to trust them and when to make fun of them, because there are a lot of things they don't get right. Be the person who fluently speaks the technology language of today and tomorrow. And learn from those who've been in the industry longer the things you don't know because you're just coming into it: the known facts within your organization and your vertical, the good practices, the rules of thumb and methodologies proven by decades that you need to learn, and that you need to teach the technology to do, in order for the whole equation to work.

Stephanie - 00:47:56: 

Yeah. Makes a lot of sense. A lot of what you were saying reminds me of the idea of focusing on meta-skills over the very specific skills we used to ask about, like, "How are your Excel skills? How are your PowerPoint skills?" Instead of those things, it's more like learning agility, creativity, your ability to collaborate with both AI and humans. It's really changing the level at which we develop skill sets. So, interesting times.

Lev - 00:48:25:

I think so. Yeah. I think that market researchers have been quite focused on a narrow set of disciplines, but they went very deep into them. Marketers, on the other hand, have spent a couple of decades learning to master disciplines and tools across a very wide spectrum. From search engine optimization to email marketing campaigns to design, mockups, and graphic design, all of those things are in the vocabulary of a typical marketer, who is a generalist. And I think something similar is going to happen with market researchers, if it hasn't already.

Stephanie - 00:49:11: 

Yeah, I see that too. Well, then finally, and I wanna get to this one with you because I do think of you personally as a bit of a futurist. So it's a great one for you. As technology, in particular, AI reshapes consumer insights, which has been the focus of a lot of our conversation today, where do you see our industry heading? Where are we gonna be in five years, ten years?

Lev - 00:49:32: 

I think the game that is afoot currently, and that will only accelerate, is that of reviving dormant data at large corporations. For many decades, we perfected the skill of answering questions with new data: syndicated data, primary or secondary data. And we have accumulated treasure troves of dormant data that we didn't have the energy, the human hours, or the technology to leverage. I think we're at the entry point of a phase in our history when that dusty archive of really important, really valuable information will become accessible to organizations in an actionable and fun way. A fun way is different from, "Oh, yes, I can spend next quarter going through the old studies I can find on the shared drive, but I'm not going to be very productive doing that, and I'm not going to be happy doing it either." It seems like the technology is finally getting there for companies to have easy access to the organization's past knowledge, so as not to duplicate efforts. Not to launch a survey that someone on the other side of the organization launched just yesterday.

Stephanie - 00:51:05: 

Happens all the time.

Lev - 00:51:06: 

Yeah. Happens all the time. And suppliers like AYTM, I think, are actively working on solving this problem. Huge tech companies are obviously working on it. Power BI is trying to solve it, and many other companies as well. But all of a sudden, it's going to be a game of hearing so much more fluency in the flow of data that goes up and down, back and forth, in all directions in the organization, powered and unlocked by new algorithms that can make sense of it, make it searchable, make it relevant. And the main problem to solve will be: what is a trustworthy signal and insight, and what is a hallucination? It may be a hallucination by the LLM, but it may also be a comparison drawn from existing data that should never have been compared, because it came from different time frames, different contexts, different countries, and is non-comparable by definition. The fluidity of the data has a huge downside of making errors and issues hard to spot, because the answers will be so readily available, and diving into the sources will be an exercise not for the fainthearted. You will need to go and research what it was and when, who did it, whether it was done properly, and whether the algorithm did what it was supposed to do. So I think there's great news for experienced researchers there, because they will be in greater demand. Their experience and attention to detail will be instrumental for companies to set up those flows of information in a beautiful, prudent, and helpful way, and to avoid making decisions on something that is very easily accessible but shouldn't be called an insight.

Stephanie - 00:53:21: 

Interesting. So it sounds to me like you're thinking there's a role for a data steward in the future, particularly within brands. And then I also think of the counterpart to the data steward: the strategic consultant, who takes these insights and drives vision to action based on what they've learned. And, you know, it's hard to do that job now as an insights person, because that's a lot to do. Right? Execute all the research and then also drive vision to action. But when you free up people who are highly strategic and say, "You've got your data steward to tell you what to trust here. Now take it, and let's create action," I think you're talking about a really beautiful partnership between different kinds of personas in research, and that's exciting to think about.

Lev - 00:54:13: 

For sure. I think if I look at everyone involved in this ecosystem, I see all of us making one step in unison closer to the data. Yeah. The decision-makers, the strategy folks will take one step closer to the origination of insights.

Stephanie - 00:54:38: 

Right.

Lev - 00:54:38: 

And all of that will continue accelerating in speed and volume of insights as a result.

Stephanie - 00:54:45: 

Interesting stuff. Well, Lev, thank you so much for your time today. And I have to say, now that you've been a guest on the Curiosity Current, you'll come back and co-host on a few future episodes. Right?

Lev - 00:54:57: 

I'd be delighted to.

Stephanie - 00:54:59: 

Well, we would love to have you. Thanks so much again for your time.

Lev - 00:55:02: 

Thank you so much, Steph.

Stephanie - 00:55:05: 

The Curiosity Current is brought to you by AYTM.

Stephanie - 00:55:09: 

To find out how AYTM helps brands connect with consumers and bring insights to life, visit aytm.com.

Stephanie - 00:55:15:

And to make sure you never miss an episode, subscribe to the Curiosity Current on Apple Podcasts, Spotify, or wherever you get your podcasts.

Stephanie - 00:55:24: 

Thanks for joining us, and we'll see you next time.

Episode Resources

  • Lev Mazin on LinkedIn
  • aytm (Ask Your Target Market) Website
  • PaidViewpoint Website
  • Stephanie Vance on LinkedIn
  • The Curiosity Current: A Market Research Podcast on Apple Podcasts
  • The Curiosity Current: A Market Research Podcast on Spotify
  • The Curiosity Current: A Market Research Podcast on YouTube