Hadooponomics: Data and Decision Makers: The Human “Resources” for Big Data in HR (Podcast Transcript)

Hadooponomics, Episode 17. Listen to the original episode.

James Haight: All right, welcome back to the Hadooponomics podcast, everyone. This is your host, James Haight. Pleasure to have you back here with us this week, and on the show I'm joined by Evan Sinar. Evan's a really fun guy to have a conversation with on this topic. We're talking about the intersection of data and HR. And you know, we're always talking about how we can replace gut decisions with data-backed decisions, and how we can sort of make our conclusions better by having data to support them. But we never seem to take the leap to the human side of that equation, where we talk about what it means if we're not just changing how we're making decisions, but actually changing who is making the decisions in our company. And Evan's a fun guy to explore this topic with, because this is his life's work, he lives and breathes it, he eats this stuff up: how companies can use data to actually maximize the human potential at their organization, and how you should be organizing your company from the top down based on the data that you have. So it's a real fun conversation about the intersection of HR and data.

Throughout the course of the conversation we actually reference a lot of material. We talk about best practices in data visualization, and we also talk about what are the best practices in terms of the intersection of HR and data. So a lot of resources to go back and check on, and if you miss it in the show, feel free to go to bluehillresearch.com/hadooponomics. We’ll have the show notes there, and we’ll actually have a designated link section to go back to all the materials that Evan references throughout the episode. Otherwise, hit me up on Twitter, @james_haight, we love hearing your suggestions. It helps us make the show better each and every week. So really appreciate that, and keep it coming.

This is a really fun episode, so I’m gonna get out of the way, and enjoy.

All right, welcome back everyone. Today I’m here with Evan Sinar. Evan is the Chief Scientist and Vice President of Development Dimensions International, otherwise known as DDI. Evan, welcome to the show.

Evan Sinar: Thanks so much, James, thanks for having me.

James: Yeah, we’re happy to have you here. I’m excited, we’ve got sort of a different spin on the take that we normally look at here, and so I’m excited to dive into that. But before we go headlong into the interview, I’m curious, just tell the audience a little bit about who you are and what you do.

Evan: Sure, absolutely, yeah. So the company I work for, DDI, we are a company focused on, essentially, developing, growing, selecting, and identifying leaders. So helping companies find the leaders that will make them successful, and in turn, making leaders successful in what they do. So we're very passionate about the work that we do, and it's a chance to really blend in a heavy research perspective. We're constantly doing active research on the state of leadership, different trends that we're observing, working with our clients to gauge how well their leadership programs are working, and how they can be improved. So it's certainly an area where we're intersecting with an HR audience, but also starting to bring in some of the key tools and techniques related to Big Data, analytics, and data visualization. So it's a fun area to be working in, with lots of change, and I look forward to chatting about it.

James: Absolutely, and part of what's really fun about this, for me, and I think our audience will enjoy it as well, is on this show we always talk about the transformative power of Big Data, and how it can change organizations, and things along that line. But we almost always talk about sort of finding insights and changing your internal processes. We very seldom talk about how it actually directly impacts who is at the top of your organization, who's running it, and then who's in charge of what, and how you structure your company. That's sort of what I wanna dive into, right? When business intelligence, and analytics, and Big Data burst onto the scene, there were certain areas where it made a whole lot of sense to apply them, right? Of course, in sales, optimizing things like lead conversion, or, of course, in lean manufacturing, and all these very quantifiable areas. It makes all the sense in the world that these are the first areas of a company where you're really applying data, this quantitative rigor, to boost outcomes. But now we're going towards HR, for lack of a better term. I'm curious, based on everything that you said and where you guys are going, how do Big Data and HR connect into one? Why is the time now? That's sort of the basis of my question.

Evan: Mm-hm, absolutely, and I think the ordering you described is very consistent with the level of sophistication across these different functions in organizations. So there really aren't any functions within most organizations that don't have a data component to them. But they've progressed at different speeds, so the operations groups, the sales groups, the logistics groups, the finance groups, those groups tend to have worked with Big Data for much longer, and to be much more sophisticated about it. How they're using it, how they're pulling it into their decisions, and how they're using it to present out the information that they share. And in a lot of ways, that's due to the reasons you mentioned. So those are functions that already had a very strong data heritage. And HR is really coming to that party a little bit later. But at the same time, HR is seeing that to have a seat at the table, so to speak, alongside their other senior leader counterparts from these other functions, HR needs to be much savvier about the use of data and analytics. People analytics, it's typically called.

Now at the same time, the data from people can be some of the most complex and challenging data to work with, and so there are certainly good reasons why HR has been a little bit slower to adopt some of those principles. Because there are ways to measure capabilities, skills, the risk of someone leaving the organization, but it generally takes more time, more effort, and it involves gathering data directly from people, which has its own implications. Because as organizations move to be more data driven, they need to be very transparent with their employees, of course, and with their customers as well, about what data they're gathering, and how that's being used. And it's really forced HR to take a stronger and more active stance about the data that are being gathered. And again, back to the timing element of this, that's a corner that HR's really just starting to turn. To not just be passive consumers of data, but to actually be shaping what data are gathered, how they're gonna be used, and in many cases, an important factor is how it's being communicated out to employees. Employees, justifiably, can be very anxious about all the data that are being gathered about them. And there are organizations that are setting up sensors that essentially track where an employee is going within the building, how quickly they're operating on their job, how much time they're spending in certain windows while they're operating their computer. And it's seen as being very useful data, but there's a major risk there around whether that data gets used appropriately and ethically. So that's another area that HR is certainly recognizing, but I'd say they still have a ways to go.

James: Mm-hm, and really, to me, in my mind there are two really inherent tension points, and you touched on the first one. This idea of striking a balance between efficiency and, call it, personal liberty, and what information's okay to collect, and if so, how and when do you use it? But perhaps one of the other issues that you hinted at, at the beginning, and one that I wanna dive into, is isn't there sort of this inherent disbelief, this belief that you can't quantify the human element? I'm thinking specifically back to Moneyball, right, when, for those out there who aren't familiar, the Oakland Athletics, and their general manager Billy Beane, got famous for essentially using statistics and data analysis to build a baseball team, rather than just relying on what his scouts saw down on the practice fields in the minor leagues, and there was a whole uproar. How do you quantify this human ability? How do you project out something that's inherently not numeric, right, into just numbers on your spreadsheet? And, to me, this seems like the more interesting challenge that you're probably facing. And so I'll pause, let you react to that, and then I wanna sort of dance around the nuances of what that actually means.

Evan: Sure, absolutely. I think you're spot on. The Moneyball analogy is one that's often used as the data get pulled into the HR function. So Moneyball as a lead-in to the people analytics view of looking at data is one that still comes up, even with that book being out for some time. And I think that really shows how much of a transition it really is.

I think there's a lot of truth to that concern, because historically, if you look at something like an interview process, almost any organization in the world uses some form of an interview process to identify which employees they want to actually bring on and hire into the organization. However, many of those interviews are done in a very unstructured way, so a particular hiring manager might ask whatever question comes to mind for them, without a lot of structure and consistency. So there's a history there of, essentially, HR getting away with data that isn't very well structured, isn't consistent. A lot of that's changed, and a lot of it's changed based on legal pressures that really push organizations to firmly justify the link between a hiring tool that they're using and actual success on the job. More systems are coming online, more broadly, within organizations to gather data about customers, or a much more detailed reading of sales performance that takes into account the product mix that a salesperson might be representing, or their geographic territory. So as those tools become more sophisticated, the data have gotten stronger. There's also a much higher rate of use of very rigorous assessments and simulations as organizations look to hire new employees in, but also to identify their development needs. So I think that the pace of quantification has certainly changed, but when you think about who's operating those systems, there's a history there of HR being not particularly data savvy. And I think it speaks to why individuals, and this has been shown through some research, go into the HR function. They're very oriented towards helping people. And that's the driving force, that's the motivator, and that's a huge component of what makes HR professionals so successful in their role.

The challenge, though, is that that orientation doesn't always reside in the same people who also have this more quantitative, analytical acumen. And there's a change occurring there, where many organizations, when they're building HR functions, are looking for more analytical skill or, in some cases, setting up their own HR analytics functions. So that, to me, is a sign that progress is being made. But there's a long history there of some challenges working with data. And again, in many cases, it's very justified, because the data just weren't good enough to really inform people's decisions in a predictive way. That's changed, and there's still some catching up to do, but that's, I think, the longer term view of that human element question, and you're absolutely right, it has historically been a challenge. Thankfully, some of the tools and technologies have come along to fill that gap. Now the challenge is how HR really meets the data where they are and uses it effectively.

James: Mm-hm. One of the things I wanted to talk about, and then I wanna transition into how we should be organizing our companies, and the changes in leadership, and things that your work sort of implies or encourages. But one piece that I would like to touch on, I think this would be a good segue, is there is some merit to this idea of the gut decision, right? I think there’s literature out there talking about how the human brain is organized in such a way to make the right decision almost instantaneously. We just sort of know it, and then we come up with a reason to back it up after. And I think to discount that is probably a mistake, but there must be some sort of middle ground. And I’m curious, what does your work show? What does your experience tell us about that?

Evan: Absolutely, yeah, and this has been an area of quite a bit of research in the field that I work in, HR, and I work in a subfield of psychology called industrial organizational psychology. And there’s been an immense amount of research into showing what are the inputs to decisions. And then if you track the effectiveness of those decisions, what’s the role that, essentially, an algorithm or a formula can play versus expert judgment sitting on top of that. And the main finding that you come up with is that, over time, the algorithmically driven decision will almost always be more effective. But that doesn’t mean in every case it will be, because there’s always gonna be a role of expert judgment. And one of the classic examples is called the broken leg problem. So if you’re trying to predict whether someone will go see the opera on a particular Friday night, you may plug all kinds of different variables into the model, but if the person breaks their leg a day before the performance, they’re not going, regardless of what the algorithm says.

James: [laughs]

Evan: So there's always gonna be a role for some expert judgment and input there. For me, I think the key is how you take that expert judgment and fuel the model with it, rather than replace it. So how do you take expert judgment, ideally across a number of individuals, where you can find some common elements that they're each drawing on as they make their decision? You can use that to inform the algorithm. So the algorithm doesn't just come up with this on its own, it needs to be guided, just like the systems, you know, there's a lot of publicity about the systems that beat humans at chess or at Go. I mean, those had to have input from humans, watching lots of humans playing lots of chess, for example. So that expertise absolutely plays a role in feeding into those models. But then you can take advantage of the consistency of the algorithm to make those decisions, and still have a role for the expert. There will always, and always should, be a role for the expert to help guide the final stages of that decision, or help make some final judgment calls based on information that's coming in too rapidly to be factored into the algorithmic side of the model.

So I absolutely think they can work together. I think, in many ways, it actually elevates the role of human decision makers, because if something is such a standard part of the decision making that they're gonna make the same decision nearly every time based on the information, why have them devote their cognitive time and energy to that part of the decision? Rather, reserve their time and energy, and their expertise, for the part of the decision making process where the algorithm leaves off. So I actually think there are models and ways that that expertise and the algorithmic side of the data equation can work very well together, and very effectively.
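To make that division of labor concrete, here is a minimal sketch, in Python, of one way the two sides might be combined. It is illustrative only: the attribute names, weights, and the override rule are hypothetical stand-ins, not anything DDI or the speakers actually describe. The point is simply that the model handles the consistent, repeatable scoring, while expert judgment enters as an explicit, documented override for "broken leg" information the model can't see.

```python
def model_score(candidate, weights):
    """Weighted sum of structured attributes: the consistent, repeatable part of the decision."""
    return sum(weights[attr] * candidate.get(attr, 0.0) for attr in weights)

def decide(candidate, weights, expert_overrides, threshold=0.6):
    """Combine the model's score with explicit, documented expert overrides."""
    score = model_score(candidate, weights)
    # Expert judgment enters as a recorded override with a reason, not a silent gut call.
    for rule in expert_overrides:
        if rule["applies"](candidate):
            return {"advance": rule["decision"], "score": score, "reason": rule["reason"]}
    return {"advance": score >= threshold, "score": score, "reason": "model threshold"}

# Hypothetical weights, ideally derived from pooled expert judgment and past outcomes.
weights = {"adaptability": 0.4, "decision_making": 0.35, "initiative": 0.25}

# A "broken leg" override: recent information the model was never built to see.
overrides = [{
    "applies": lambda c: c.get("announced_departure", False),
    "decision": False,
    "reason": "candidate has announced they are leaving the organization",
}]

candidate = {"adaptability": 0.8, "decision_making": 0.7, "initiative": 0.6}
print(decide(candidate, weights, overrides))
# -> advance: True via the model threshold (score around 0.715)
```

The design choice mirrors the conversation: routine scoring is delegated to the consistent formula, and the expert's contribution is captured as a named rule with a reason, so it can later be reviewed and, if it proves consistently useful, folded back into the model.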

James: Mm-hm, and this is an interesting segue, and I think it’s clear what you’re not advocating for is a supercomputer to be the CEO, right? [laughs] I think it’s some sort of middle ground before we get to that. [laughs] But part of what I’m curious about transitioning to, in theory, that model sounds beautiful, right? This idea of abstracting away all the routine and mundane decisions, and then only using the human capacity for things that are more nuanced and are more creative, that require sort of this thinking outside of the algorithm. How do we get there? How are we organizing the companies of our future? What’s this going to look like, and is anyone there yet? Is anyone close to there? Or have we still got a long way to go?

Evan: Yeah, it's a great question, and certainly there are organizations all across that continuum. A lot of people have probably heard about some of the research that Google has done into what makes up successful teams, and what makes their projects effective. So there are organizations that are taking some of the same analytical techniques they've developed through the operations part of their business, for example, and applying them back to the people side of the equation. And that's the field I'm in, industrial organizational psychology: an entire field set up to really try to bridge the gap between the research and scientific side of our world and applying that to practice. And I think there are some foundational findings there, and thankfully, again, there's a healthy body of research being built up around certain techniques and practices, whether it be with leadership, or building teams, or hiring individuals, showing that, on average, you're gonna tend to be much more successful using these practices than using something else. But then, in many cases, organizations are building up their own analytics functions to take that research even further, to draw on data that might be proprietary to, or very unique to, their organization.

So I think we're trying to build up that broader set of research, and again, those are the shoulders that practitioners in HR and other functions are standing on top of. But then how do you build on that with your own data? And some of that, of course, is just the openness to it. And as you said earlier, that's not a challenge for anyone on this call, but it's something that HR as a function is trying to make more progress on. I think at this point, there's pretty broad recognition of the value of data to inform people's decisions. It's a matter of building up those skills and picking up those themes. The next challenge, I think, is getting the data that you need, which isn't always the data that you already have. A lot of companies have data that was set up for a very different purpose. It might track customer call time, for example, in a customer service organization. The amount of time that you spend with customers, and in many cases reducing that, is an important data point, but it doesn't take into account how the customer actually left that interaction. Did they leave that interaction wanting to never talk to the company again? Or did they leave it wanting to recommend to their friends and colleagues that they work more with this organization? So it really speaks to a deeper type of data gathering that needs to be done. And that's, I think, the next, really the ongoing, transition for these organizations. It's not just stopping with the data that are available to them now, but planning out what data they really need to answer the questions in the ways that they want to, and to help grow the organization. And so it changes from using the data that are already there to taking a very business focused and customer focused approach to gathering new data to guide them forward.

James: Yeah, that certainly makes sense to me, and points to a clear path forward. As we sort of wind down here, you mention there's a broad body of research on certain things that are more effective for teams, or for leaders, or for, you name the function. When you're working with clients and doing your research, what do you see? Are there any sort of very specific recommendations or takeaways that you tend to bring with you to new engagements, or when you're helping new people construct these sorts of systems? I'd just be curious, from a real practical standpoint, what are the main things that you bring to the table?

Evan: Absolutely, yeah, and that's where, when you look across a number of organizations, you do start to see some consistent trends. And from a development perspective, that's key, that's really what a lot of our work comes back around to: how do we help leaders grow and develop? And I think part of that is certainly opportunity driven, and organizations are finding ways to try to develop leaders in a variety of ways. So we tend to think of it as a learning journey, we call it, where it's not just sending someone a link to an online training program and saying, okay, there's your training on coaching. That just doesn't cut it, and it's just not actual skill development. So we think the importance of actually practicing, and trying out, and watching others exhibit these skills is key. When we work with our clients on how you identify someone who's gonna succeed in a leadership role, yeah, you can interview them, you can ask them what they did. But it's much better to see what they can do, and actually see leaders in realistic business environments and scenarios. And getting feedback from that is what's gonna cause a leader to really grow and develop, along with the opportunity to practice your skills with your own manager, and the opportunity to have new developmental assignments that'll let you try out new things. So those are key principles that we see, just in terms of having a development plan in place. Do you, as a leader or as an employee, know how you're gonna develop, and what you should be thinking about developing as you progress through the next year or two of your career? Do you have that longer term career view in place? And as for the role that the leader plays there, the leader has to be open to it and help enable an employee to connect in with those experiences. But there's also a strong impetus on employees themselves to be very open and to have a long term perspective on what they're gonna want from their job, longer term. So as we think about how leaders develop, those are some of the key attributes.

In terms of how organizations should start to think about their jobs, and the roles within their organization, I mean, it sounds simple, but many organizations aren't really taking accurate stock of what the key skills, attributes, and personality factors are that lead to success. And you really do have to take a very structured approach to determining what's been called a success profile: the things that really make those who are successful on the job stronger than those who are struggling. And if you do research over time, and with organizations, and differentiate those who are succeeding in the job from those who are struggling, you're gonna see some consistent patterns there. And it varies a bit by organization, so in some cases it's not a one size fits all model. I mean, there are some commonalities, like adaptability, decision making, and initiative, which tend to be pretty consistent predictors across organizations. But there are other factors, like how teams are set up and how partnership models are built. Organizations that have sales functions can have very different models of how they approach and interact with their customers. So there is a point that you can get to from some of the foundational research that exists, but I think, and we think, there's always gonna be a role for doing more of that research internally to your organization, because that allows you to pick up on the context and the environment that your organization's own leaders are working within.
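As an illustration of the success profile idea, here is a small, hedged sketch: given ratings on a few attributes for people who are succeeding in a role and people who are struggling, compute how strongly each attribute separates the two groups. The attribute names and the numbers are invented for the example; a real success profile study would rest on validated assessments and much larger samples.

```python
from statistics import mean, pstdev

# Hypothetical 1-5 ratings for two groups of incumbents in the same role.
successful = [
    {"adaptability": 4.2, "decision_making": 4.0, "initiative": 3.8},
    {"adaptability": 4.5, "decision_making": 3.9, "initiative": 4.1},
    {"adaptability": 4.0, "decision_making": 4.3, "initiative": 3.9},
]
struggling = [
    {"adaptability": 3.1, "decision_making": 3.5, "initiative": 3.0},
    {"adaptability": 2.9, "decision_making": 3.8, "initiative": 2.8},
    {"adaptability": 3.3, "decision_making": 3.6, "initiative": 3.2},
]

def effect_size(attr):
    """Standardized mean difference: how much this attribute separates the two groups."""
    a = [person[attr] for person in successful]
    b = [person[attr] for person in struggling]
    spread = pstdev(a + b) or 1.0  # guard against a zero standard deviation
    return (mean(a) - mean(b)) / spread

for attr in ["adaptability", "decision_making", "initiative"]:
    print(f"{attr}: {effect_size(attr):+.2f}")
```

Attributes with larger separation would be candidates for the profile, subject to the caveat Evan raises: the pattern can differ by organization, so the analysis is worth repeating on your own data rather than assuming a one size fits all answer.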

James: Mm-hm, I think we see this all the time, right? Someone who's really successful in one company might be very unsuccessful in another. They might have the right personality traits, whatever you call them, like extroversion and the ability to lead and communicate, and all these things, but because of the way the teams are structured, it just doesn't fit with their style of being successful. So I think that's an interesting takeaway.

One sort of thought experiment I was just having while you were explaining this. If we go all the way back to that Moneyball analogy from the very beginning of the show, in theory, part of it is saying we can predict future success by looking at some of these other, undervalued statistics, like, say, walks, or on-base percentage, rather than just the traditional measures. So in theory, you can just look at a giant spreadsheet, pick out the people with the attributes that you want, and peg them as future stars, successful players in your organization. And if you apply that logic to a company, right, if you had all your employees in a big spreadsheet and you listed out these different attributes, maybe leadership, communication, someone who's extroverted, someone who's data savvy, in theory, you could, to some extent, pick out the people that are gonna be future leaders from that, right? Is that sort of what you're saying is at least plausible, from a conceptual standpoint?

Evan: It is, yeah. I think the analogy is very appropriate, because with baseball, or football, or any of the sports, they have the advantage of some fairly structured performance metrics where you're comparing first baseman to first baseman, and pitcher to pitcher. And there's a structure there set up around the strikeouts and all the different ratios that are involved. And then, of course, you've got the combine, where you're putting them through very structured experiences and tryouts to gauge their speed, their reaction time, and other metrics. If you think of that in the analogy of a workforce, it's not always easy to get data that allow you to compare individuals using the same metrics. That's why a lot of organizations are investing more in tools that help them put individuals through a similar scenario. So for leadership roles, that may involve actually coming in and going through a day-in-the-life experience, where you're playing the role of a particular leader and working and interacting with others who come in and ask every single person that goes through the simulation the same questions, and gauging how they react and how they demonstrate those skills. And the reason that's important is it allows you to make more direct and much more accurate comparisons between two different leaders, who may be operating and working in two different countries, two different functions, two very different parts of the business. Some of this comes through your performance management processes, but in some cases it comes through a separate set of simulations that you put them through. If you don't have that kind of data, it's very hard to compare people. And organizations are looking to determine who's gonna lead this company into the future, who are the individuals that we're identifying as having very high potential, or really making decisions about who's gonna fill these spots, all the way up into the C-suite roles in the organization. Chief financial officer, chief marketing officer, there's only a small number of those spots, and as organizations think about who's gonna fill those roles, it's a very high risk decision. So it's not only about the past performance data that they gather about individuals, but how you get data that really puts them in a scenario where you can understand how well leader A is performing in the situation versus leader B. It's how you set up those tools. So many organizations are using the models of Moneyball, for example, to set up those types of tools and systems to make sure that they're gathering the right data, and consistent data, to make those decisions to help get leaders into the roles where they'll be most effective. And then once they're in those roles, how do you develop them, and grow them, to be even more effective in the future.
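One hedged sketch of the comparison problem Evan describes: if leaders in different functions or countries go through comparable simulations, you can standardize their scores within each context and then compare composites across contexts. The names, contexts, and scores below are made up purely to illustrate the mechanics, not drawn from any real assessment.

```python
from collections import defaultdict
from statistics import mean, pstdev

# Hypothetical simulation scores for leaders in two very different parts of the business.
leaders = [
    {"name": "A", "context": "sales_emea", "coaching": 3.8, "decision_making": 4.1},
    {"name": "B", "context": "sales_emea", "coaching": 3.2, "decision_making": 3.6},
    {"name": "E", "context": "sales_emea", "coaching": 4.1, "decision_making": 3.4},
    {"name": "C", "context": "ops_apac",   "coaching": 4.4, "decision_making": 3.9},
    {"name": "D", "context": "ops_apac",   "coaching": 4.0, "decision_making": 4.3},
    {"name": "F", "context": "ops_apac",   "coaching": 3.6, "decision_making": 3.5},
]
skills = ["coaching", "decision_making"]

# Standardize each leader against peers measured in the same context, so a score
# means the same thing whether it came from sales in EMEA or operations in APAC.
by_context = defaultdict(list)
for person in leaders:
    by_context[person["context"]].append(person)

for group in by_context.values():
    for skill in skills:
        scores = [p[skill] for p in group]
        mu, sigma = mean(scores), pstdev(scores) or 1.0
        for p in group:
            p[f"z_{skill}"] = (p[skill] - mu) / sigma

# Rank leaders on a simple composite of the standardized scores.
def composite(p):
    return mean(p[f"z_{s}"] for s in skills)

for p in sorted(leaders, key=composite, reverse=True):
    print(p["name"], p["context"], round(composite(p), 2))
```

This is only the mechanical half of what Evan is describing; the harder part is making sure the underlying simulations actually measure the same skills in the same way for everyone who goes through them.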

James: And it strikes me, I’m gonna play devil’s advocate a little bit, the greatest danger, and possibly even worse than maybe even doing nothing, is having the hubris to believe that you can design the right metrics to perfectly relate to an incredibly dynamic, and changing, and evolving workplace, right? And I think if you just blindly trust any sort of model like that, you’re probably in danger of being worse off [laughs] than if you didn’t have one.

Evan: Well absolutely, yeah, and that's why, in the field I work in, there's an immense amount of rigor that goes into these models. And you're trying to constantly move that needle up, because many organizations are using very unstructured ways of looking at resumes, or of doing interviews. If we can make the model more accurate than that, then we're making progress, and there's thankfully a healthy amount of research showing that. But at the same time, and this goes back to the expert judgment comment, there should always be a role that expert judgment plays, as well as development within the role. So it's important to understand that these tools aren't designed to slap a scarlet letter on someone, saying they're either gonna succeed or they're not. It's much more nuanced than that. And that's why many of the tools that we, and many organizations, put in place are not only designed to help inform a decision about who gets the job, but much more broadly, they're designed so that, regardless of whether someone gets the job, we can learn from that experience to help them become stronger, whether it be with this company or with another company. And it provides an immense amount of that developmental information. So there is a baseline that any of these tools needs to surpass, to make sure it's making positive progress towards a model that's not just more effective, but helps make someone more engaged and more satisfied with their job. So that absolutely plays a role.

James: And certainly the impacts of that, and the implications of doing it well, or at least better than nothing, and making progress, are enormous, with tremendous benefit for the organization. I can certainly see that.

So we can probably talk about this all day, but as is usually the case with these episodes [laughs], and our guests, it’s tough to fit everything into the time frame. So Evan, with that, I wanna say thanks for coming on the show. But there’s a lot more we can learn from you, and I’m curious, if our audience wants to learn a little bit more about what you do, and maybe some best practices, and just stay up to date on your research, where are they going, what can they check out?

Evan: Absolutely. I know you mentioned the possibility of putting a few links in the podcast description, and I'm certainly happy to work with you on that. In terms of me, personally, I've really put a strong emphasis, over the last year and a half or so, on trying to be more active on social media platforms, Twitter in particular. And you're not gonna read about what I had for lunch, or anything like that, it's not the kind of thing [laughs] that I post about. I offer some things of myself, of course, but I really see myself as trying to be a curator of some of the great information that's out there on Big Data, on data visualization, on HR analytics. I come across a lot of great work from a lot of great folks, and I try to pass it along. So most of the things that come through my head, and that I find that others produce, I try to push out through the social media platforms, again, Twitter and LinkedIn in particular. I try to do a fair amount of authoring, we have a blog platform at DDI, so I try to come up with some things that I think are interesting, and hope others will find interesting. So I can certainly provide some of those links as well, if anyone's interested in knowing more about the kinds of things that I'm thinking about, or the kinds of things that I find interesting. And all of those are fascinating areas, of course, as this group well knows. Just the amount of great insights and research coming out around Big Data, around analytics, around data visualization, there's so much out there. I think it's exciting to be a part of it and try to channel that through to others, and try to pick out some of the key findings, key research, key observations, key insights. So I try to do that through those different platforms.

James: Great. Well we’ll have all those links to everything up on the show notes at bluehillresearch.com so our audience can go there to learn a little more about you. But in the meantime, Evan, just wanna say thanks so much for coming on the show. It was a pleasure to have you.
