Mariah Hay, Head of Practices at Pluralsight

Mariah Hay started her career designing luggage and soft goods. With the advent of the iPhone, human-centered design jumped front and center, not just for product designers like Mariah, but for the digital space. Problem was, and still is, that ethics typically isn’t part of a designer’s education. Rapidly creating and iterating, designers can find they’ve created a monster without even realizing it.

How can we help people access experiences, and how do we end up blocking people from critical services and information, such as medical records, legal systems, or education? As the Head of Practices at Pluralsight, Mariah often asks herself these questions. She joins us to talk about awareness and aptitude, identifying blind spots, and asking for help when it feels like you might be missing something.

 
 

Carl: Hey everyone, and welcome back to the Bureau Briefing podcast. It is Carl, and today we're going to have a pretty serious conversation about something we all can play a role in: ethics in digital product design. Now, we have someone on the show who I am excited to have not only here today, but who is also going to be with us at Design Leadership Days in Seattle this September. I did not know until recently that she was a SCAD grad, so-

Mariah: Guilty as charged.

Carl: I might dig up a little dirt around that. That old Savannah. But she is the Head of Practices at Pluralsight, which I think most of you probably know. It's a tech skills platform, and we'll talk about that a little bit. She actually helped launch it when she was the VP of Product. And she is Mariah Hay. How's it going, Mariah?

Mariah: It's so nice to be here. Thanks for having me, Carl.

Carl: You are welcome. Well, do me a favor. Tell everybody just a little bit about how you got started. I dropped the little SCAD bomb. Here in the southeast, SCAD is, I just love the concept of that school. So tell everybody about how you got into it. 

Mariah: Absolutely. Well, it's funny. Both my parents are artists and art educators. I actually started my career as a frustrated artist. And I had a professor when I was an undergrad who saw what I was creating. I was in sculpture, creating these weird little artifacts that solve problems. And my professor said to me, "Have you thought about product design?" And I said, "What's product design?" And of course this was before you could Google stuff. So I went and spent a couple months in a library, and industrial design really came to the forefront of my research. And I thought wow, that would be really cool. I could go develop physical products. So I ended up landing in the graduate program at SCAD. And it was really very life changing. I felt lucky to land at that university in particular because they focused on human centered design and how you create products based on solving problems for humans. And we got all the engineering stuff. But there are some schools where industrial design is much more heavily engineering focused. So that's how I got my start in product design and human centered design in general.

Carl: So you graduate from SCAD.

Mariah: Yes. 

Carl: And what happens next? 

Mariah: Well it's funny, I have moved around from different product types in my career. I actually started out designing luggage and soft goods, believe it or not.

Carl: Seriously?

Mariah: That was my first job out of grad school.

Carl: Oh my gosh.

Mariah: Yes. So I know a good bag when I see it. We'll just put it that way. 

Carl: So back up for a second. I'm sorry to cut you off, so is this all about pocket and zipper placement? Is this about figuring out the way it feels and the way-

Mariah: It's everything.

Carl: Oh my goodness, this is so exciting to me. It's The Design of Everyday Things, right? You just don't think about it. Okay, let's continue on. That just makes me happy inside to meet somebody who's designed luggage. This is my life.

Mariah: Yes, exactly. 

Carl: So you do that. And was that fun?

Mariah: It was really interesting from the standpoint of something that in industrial design we call human factors, which is thinking about the size of the human body and how we interact with objects. And when it comes to luggage, our target was frequent business travelers. So we made a lot of those upright bags with wheels that you see people dragging around on airplanes, and business cases. And we had a lifetime warranty on the bag. So not only were we thinking this bag is going to get really high use, so it has to be comfortable and functional for a frequent business traveler, but it also has to be incredibly durable, because if it's not, it's going to hit the company's bottom line. So we did an incredible amount of testing: dropping a bag on a corner or a wheel thousands of times until it breaks, and understanding the mechanics of that. That was my life.

Carl: I could've done that testing for you as clumsy as I am. If you were to just give me the bag, within two weeks, I could have dropped it 1,000 times. So humans stay the focus. And how do you make that transition from physical products into digital? 

Mariah: So I guess it was about, so end of 2007, 2008, iPhone's released, game changer in the market. Not just because it's an interesting object, but because it was accompanied by this new concept of consumers buying apps on an app store. And prior to that of course, a lot of software was just developed for businesses. And usability was not really a thing because a business had maybe one or two choices of software in a particular area and employees just suffered through using it. But now you've opened up this marketplace where the user actually gets to pick the thing that's easiest. It's quick and easy to develop apps. And suddenly, human centered design and user friendly design started to break the tie. 

So around that time, my career veered into the digital space, because you can apply the exact same human centered design principles to designing anything, whether it's an environment or a physical product like luggage, or an app. It doesn't matter. So that's where the jobs were.

And in fact, the digital space was even more interesting because the cycle of creation and feedback is so rapid compared to physical products. So I just went into that world then.

Carl: Right. Because now you've got so much more data and you can even watch people in real time using something. So I'm just thinking, how do you avoid data overload when there's so much that you can look at?

Mariah: I personally don't think we're at the data overload point yet when it comes to product feedback. I think that we're just starting to scratch the surface on what we can harvest and how we can take action on it. 

Where we are today: if you're a product person worth their salt and you create a product, A, you know what you're trying to accomplish for customers, and B, you've already created clear metrics and markers around measuring what success looks like when you launch something. So you're laser focused on gathering data around that. And then you can quickly pivot and change. So I haven't actually suffered from data overload. I have suffered from, how do we get the data to measure this thing? And now we're moving into a brave new world of applying machine learning to large data sets to look for patterns that humans aren't necessarily that adept at spotting.

But honestly, a lot of the digital products that are out there are pretty dumb. They're not harvesting enough data for machine learning to be meaningful yet. But you see it now at big companies like Facebook and Google, and particularly in the marketing space when you're looking at customers and segments. Marketing is probably way ahead of everybody, and always has been, when it comes to data, because it behooves them from a sales perspective to be able to leverage mass quantities of data to target very specific people in markets. But from a software or digital product perspective, we're not in that same place as marketing yet.

Carl: Right. But we're making that move there, right? 

Mariah: We're moving that way rapidly. Yes. In my opinion.

Carl: So this is a great transition into talking about the ethical responsibilities that we have when we do start to know more, when we do start to see more. And I think from everyone that I've met in the Bureau community, and I'll even say on the internet in general in terms of the people who are building apps and tools and all that, most of us seem to have a good moral compass. I'd like to think that, and it lets me sleep at night. But what are the responsibilities that we have? What are the challenges that we're not really seeing when it comes to ethics?

Mariah: I think the biggest challenge right now at this point in time for our industry is that ethics is not part of our education. Unlike structural engineers, or lawyers, or medical professionals, we don't take an ethics course. I didn't take an ethics course when I was an industrial designer. So it's just not necessarily top of mind. And you combine that with the fact that we're in such a fast moving, rapid, iterative cycle of creation and feedback. If you're not consciously thinking about that alongside what you're creating, you can create a monster without even realizing it.

So it's a combination. I like to think of ethics as a combination of several things. A, just awareness that it's there. B, diligence: making sure that you're peeking around the corner, giving it a stress test, and seeing how somebody might abuse it. That should be part of the product cycle as well. And we're starting to do that more and more.

I think that teams that use human centered design fundamentally run into those things. Because as part of the discovery of solving a problem, they can see around those corners a lot more readily. But a lot of teams don't do human centered design, so they might create a problem without realizing it.

And then C, not working in a vacuum. So I think that we're coming from a culture in design where somebody goes into a closet and creates a graphical user interface or a product or an offering. And then they ship it, and they may or may not be incredibly capable designers, so they might inadvertently create a problem.

A perfect example is in the US around 2011, when the Affordable Care Act happened. Part of the Affordable Care Act was mandating the use of electronic medical record keeping systems. So you had a slew of small startups, and even the big behemoth purveyors of the software, rushing to market, which meant they were just creating as fast as they could. But they weren't testing it. So they ended up delivering experiences with electronic medical records systems into hospitals and healthcare systems that actually did damage. They slowed down doctors, they hid critical information, and that ended up killing patients, and is still killing patients today. And they didn't do it on purpose, and it wasn't something that was weaponized. It just wasn't diligence in the practice of how you build something. It's just like if an architect or a construction company were to throw up a building really quickly for the sake of rushing to get it up before the building next door, and the building collapses, you probably wouldn't be surprised. So that's what we're seeing happen today. And that's yet a third ethical misstep that's very common.

Carl: So what was it that alerted you? What was it that got you paying attention and pulled you into this as a focus?

Mariah: So I was asked to do a talk at a conference, I guess it was last year. And the reason why they wanted me to come be part of this group of people is because of my experience in the education space, and to talk about the implications of working in education and technology.

And as I was doing my research for the talk, because that's a very broad brushstroke of a topic to speak about, I started going down this rabbit hole of the ethics around providing education to underserved populations. That's one thing that Pluralsight, the company that I work for, has a focus on: being able to provide technology skills training for anybody that has internet access. Very accessible, low cost.

And as I was doing this, I started going down this ethics rabbit hole, and it veered my attention away from the focus on education and more toward how we help people access experiences in general, and how we end up blocking people from things that are life critical, like electronic medical records systems. Or if you are familiar with Cathy O'Neil's book Weapons of Math Destruction, she goes systematically through each industry and shows how the lack of feedback loops in products, and the way products are designed, actually blocks people from legal services, financial access, and so on, creating really terrible after-effects for populations of people across the world, and particularly within the US in our tech bubble today.

Carl: I'm just going to take a breath. Because that is a lot to realize and a lot to look at. As people are in their development cycles, as they're in their research cycles, what is it they need to add to make sure that they're not hurting people inadvertently? 

Mariah: You can do a couple of different things. What I always suggest is making sure that you have somebody with the right skills on the team. And this is making sure that when people are creating something, they have awareness. So if they're a designer, they have the aptitude. A lot of companies will hire people, and they might not necessarily truly understand their skillset. So you might ask somebody very junior to do something that's incredibly complex. And while it might come out the other side maybe looking like it's fixing the problem, it might not actually fix the problem. So that awareness, and then personal awareness. So I always challenge designers, and product people, and engineers to really know yourself, and learn your blind spots, and go ask for help. If you feel like you might be missing something, go ask for help. So that's the first thing.

I know that takes a lot of psychological safety in a company, but if you don't have psychological safety within your company, there are meetup groups, there's industry, you can go online. You can find resources with people today. So that's kind of the first thing.

The second thing is to put ethics on your checklist: "Would I be creating something for the end user that is not intended?" And really chase that down. Here at work, a perfect example: a couple of years ago we launched a skills assessment tool. We call it Role IQ.

And basically you take this quick little 20-to-25-question assessment, and it helps us put you at the right place in our library, because we have a very big library of content. We have over 7,000 courses. And that's an incredibly useful tool for a learner to help create self-directed learning.

However, as we were launching this tool, we were also selling it into large companies, and the large companies were like, "Well, we want to know what people are getting on their Role IQ." At the time, Role IQ was a 300-point range, broken into novice, proficient, and expert ranges. And we thought, "I wonder how the learners would feel about their data being exposed."

It would've been really easy for the team to go, "Cool, we can give you that data." But we really wanted to think through: if a learner figured out that their data was being given to somebody and used in a way that wasn't intended, they actually would not use Skill IQ, and then they wouldn't keep learning. So we would be blocking the very mission of our company and what we're trying to do.

So the team went and spent some cycles and talked to learners. "What would you be comfortable with? What kind of conversations would you want to have around this with a manager, or a boss, or a colleague?" And we ended up really deducing what we could do and what we couldn't do, so that we could still uphold our mission of skill development for a learner, and be able to provide useful data points to a tech leader so that they could have constructive conversations, instead of just blindly giving them what they wanted, which they could use in a way that takes a data point completely out of context. Like, what if they were to stack rank people against each other? Well, that's not really what our algorithm is meant to do. That's not really what the data is useful for.

So we're very careful about those things. And that's the human centered design process: what would this create for somebody? What is our intention, the outcome we're trying to create? And then making sure we're doing that. I think that fundamentally helps practitioners avoid some of these pitfalls.

Carl: How many people did you say have used or are currently using Pluralsight as learners?

Mariah: Oh goodness. I think we've got several million users, but that's super ballparky. I would actually have to go look.

Carl: That's great. The reason I'm asking is Project Inkblot is going to be at Design Leadership Days as well. And they're giving a talk on diversity in product design. And I'm just thinking, if you've got millions of users, then you see cultural differences in how people learn as well. And I'm just curious, is there something you do with that when you're trying to get human centered and you know that different people with different upbringings learn in different ways? Does that get plugged in?

Mariah: That is absolutely a consideration. It started with us acknowledging that and creating awareness around that, because we started with the US-based market. But now we have a ton of learners in Europe, Australia, and India. So what we do when we do our user calls is go through a process where we start out with something called VOC, voice of customer. Before we ever design anything, we go and investigate the area and space and see: what are they currently doing? What are their main pain points? And that helps reveal what would be useful to design for them.

We make sure that we're talking to people from different geographies, and we're trying to make sure we include different genders. And now we have a major focus on accessibility. So people that are visually impaired, how would they navigate our product? 

So we're trying to approach it from many different angles and keep our teams cognizant and aware of this. We've even started to dig into things like our users in India who have less substantial infrastructure when it comes to wifi and internet access and who are using our product on mobile. What can we provide to them that would be useful? How can we make our direct translations more effective for people that are not English speakers?

So we really run the gamut of trying to make sure that we're meeting our users where they are. And we're not perfect. But just creating that awareness on the teams is effective. And then also making sure that we're making our product teams themselves as diverse as possible.

When I started three and a half years ago, there were no women product managers, no women product designers. And now we're starting to get close to half. Same thing for underrepresented minorities on teams. I know for me personally, I worked on a team about a decade ago where one of my really close colleagues, she grew up in Puerto Rico. And she would always raise the flag and say, "Are we considering our Latin American market? How would this translate into Spanish?" And it changed my mindset, being somebody Caucasian who grew up in the US. That wasn't something that would even occur to me. So having her on my team helped flesh that out more. So we're trying to tackle it from a lot of angles.

Carl: And there are a lot of angles to tackle it from, right? That's the thing. But I want to thank you so much for being on the show today, and for sharing with us what you're doing at Pluralsight, and a little bit of your background. And also ways that we can approach things in a more ethically centered, human-centered way. So thank you so much, Mariah.

Mariah: Oh, it's my pleasure, Carl.

Carl: And everybody listening, thank you so much. I hope you learned some good little nuggets today, and we'll talk to you next week. All the best.


Show Notes

You can still catch our second annual Bureau Online Summit tomorrow, July 19. We’d love to have you join us.

Thank you to our amazing partners for making The Bureau Briefing possible!

Mailchimp does so much for digital shops—their agency partner program is phenomenal. Be sure to check it out.

VOGSY is the platform for Google Cloud when it comes to professional services automation. So if you have a creative or digital services firm, look at VOGSY. They can help you run your shop so you can keep more of your money and make clients happier.


The Bureau Briefing Is Brought to You By:

 
 
