
    Entries in science (6)

    Monday
    Jan 23, 2017

    On the balance between data and people

    Quick shot for a busy Monday. If your organization is one of the many that has implemented, or has at least considered implementing, a more data-intensive and analytical approach to HR and talent management, then I recommend taking a quick look at the comments from a young leader in another discipline where data and analytics have completely changed talent management - the world of professional soccer.

    Since Moneyball, and maybe even before that, all kinds of sports (baseball, basketball, soccer, and more) have seen a kind of revolution and sea change in the approach to player evaluation, team building, and even in-game strategy, driven by the increasing availability of advanced data about player performance and better tools to assess and crunch that data. No leader of even a half-decent professional sports team fails to consider metrics, data, and analytics when making decisions about talent.

    And so it has also come to pass that in the 'real' world of work, more and more organizations have embraced similar, data-driven approaches in their talent management programs. Assessments that validate a candidate's 'fit' for a role, algorithms that assess employee data to flag flight risks, and models that pinpoint expected future leaders are just some of the examples of how data, science, and analytics are being used in HR.

    But if you have begun adopting these data-driven approaches to talent management processes and decisions, how can you know if you have perhaps gone too far, or have let the 'human' part of human resources fall too far by the wayside?

    I think the answer is that it is kind of hard to know for sure, but you probably know it when you see it. But I think it stands to reason that even today, in any field where human performance and human capability are what matter, it can be dangerous to completely trust the data and fail to consider the people.

    Here's what Julian Nagelsmann, (a millennial, for what it's worth), manager of the German Bundesliga side Hoffenheim, has to say about blending data, analytics, and the 'human' side of management in forming his approach to leading his team. (Courtesy of The Ringer):

    I studied sports science and have a bachelor of arts. The variety of football data is becoming more and more specific. You shouldn’t make the mistake of looking at football as a science, but there are more diagnostic tools, and the examination of the human body is improving in football: What effect does AstroTurf have on the body? What does lots of shooting do? What does lots of passing do to muscles? There are always new methods and you have to go with the science, but football will never be a science.

    There will be more influence from science to analyze games, and you have to keep educating yourself. But you mustn’t make the mistake of seeing football as something technocratic or based on something that is fed by science. You can develop the person by using scientific aspects in your judgement, but the human is still the focus.

    A really interesting take from a manager of a team of highly accomplished (and highly compensated) professional soccer players. Even in sports, where every move, every decision, every physical reaction to game circumstances can be and is analyzed, and the subsequent data parsed and performance conclusions reached - Nagelsmann still cautions us not to forget the humans.

    In fact, he goes much further than that - he claims the human has to remain the focus.

    Take in the data, be open to the data, don't be a data Luddite - but don't let it become the only tool you use as a manager or a leader.

    Super perspective and advice from a leader who sits squarely at the nexus of an industry and discipline that has historically been a 'gut feel' business and is now being disrupted by data and analytics.

    Use the data. But don't forget about the people.

    Great advice for a soccer team or for an organization near you.

    Have a great week!

    Wednesday
    Aug 26, 2015

    Learn a new word Thursday, I mean Wednesday: The Dunning-Kruger Effect

    I know that the wildly popular 'Learn a new word' series on the blog is meant to be a semi-regular Thursday feature, but I didn't want to let this new term I just came across languish for another 24 hours, hence we have the first iteration of 'Learn a new word Wednesday.'

    Today's word/term helps us understand the problems we have had in our own careers and in our own organizations with an element of the traditional performance management process known as the 'self-assessment' or 'self-rating.'

    You know, that component of the typical performance management process (usually positioned at Step 1), where you and everyone else are meant to attempt to quantify your own skills, competencies, and progress towards meeting whatever goals were set for you way back when.

    Let's see, do I give myself a '3' or a '4' for 'Tolerance for Ambiguity?' If I go with the '4', does that make me look like someone who is just trying to prop myself up above the other jokers in the group? But if I only give myself a '3', then that will make it easier for my manager to rate me as average too, since if I only think I am a '3', then why should she disagree with me?

    It's a nightmare, no doubt.

    Which brings us to today's Learn a new word. Let me introduce, (apologies if you have heard of this before, it was new to me when I saw it) - The Dunning-Kruger Effect.

    From our pals at Wikipedia, (so you know this is true):

    The Dunning–Kruger effect is a cognitive bias wherein relatively unskilled individuals suffer from illusory superiority, mistakenly assessing their ability to be much higher than is accurate. The bias was first experimentally observed by David Dunning and Justin Kruger of Cornell University in 1999. Dunning and Kruger attributed the bias to the metacognitive inability of the unskilled to evaluate their own ability level accurately.

    Their research also suggests that conversely, highly skilled individuals may underestimate their relative competence, erroneously assuming that tasks that are easy for them also are easy for others.

    There it is, scientific proof that we are all, the skilled and the unskilled alike, (read 'unskilled' and 'skilled' as 'average' and 'high' performers and you see where I am going), pretty much incapable of accurately assessing our own ability.

    It makes intuitive sense, kind of, that the unskilled or even average performers would assess themselves a little too favorably when given the opportunity - after all who likes to actually admit they are not very good at something? Add into this tendency the crazy pressures and power dynamics that come from the workplace performance management process and you can easily see how self-assessments become really dubious in terms of their value.

    On the flip side, the Dunning-Kruger effect tells us that highly skilled performers will undervalue themselves and their abilities. 'If I can do this easily, that must mean it is easy to do,' goes their thinking.
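The two-sided pattern can be sketched with a toy model (my illustration, not anything from Dunning and Kruger's study): imagine self-ratings being pulled toward the middle of the scale, so that low performers overshoot and high performers undershoot by the same mechanism.

```python
def self_assessment(actual_percentile, pull=0.5):
    """Toy model of the Dunning-Kruger pattern: a self-rating is the
    true percentile blended toward the 50th. The 'pull' factor is an
    illustrative assumption, not an empirical figure."""
    return actual_percentile + pull * (50 - actual_percentile)

low = self_assessment(10)   # a weak performer rates themselves at 30.0
high = self_assessment(90)  # a strong performer rates themselves at 70.0
```

Under this (entirely made-up) blend, both performers are off by the same 20 points, just in opposite directions, which is roughly the shape of the effect the quote above describes.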

    This is likely the fundamental reason why in sports so many of the very greatest players don't actually succeed in post-playing career efforts at coaching. Playing the game at a high level came so easily to them, that they can't see why it does not come so easily to the normal or average players that they have to coach and mentor, resulting in frustration and suboptimal outcomes.

    You might have had a sneaking suspicion as an HR pro about the shaky and questionable value of the self-assessment process. If you did, you now have a fancy term to attach to your POV.

    Don't blame the player. Blame the Dunning-Kruger effect. 

    Wednesday
    Nov 7, 2012

    Why everything takes longer than we think it will

    Think about the countless times that you've been wrong about estimating the length of time it would take to do something, to get somewhere, or complete some type of task or project.

    Something always goes wrong along the way, some unforeseen circumstance puts you or the people and systems you are counting on behind schedule, or we simply, (and fairly consistently), are overconfident in our own ability to get things done in a given amount of time.

    Why is that the case so often? Why does it seem like we are constantly explaining away missed deadlines, or alternatively, griping about the inconvenience that other people's missed deadlines have on us? Well, it turns out there might be a (sort of fake-seeming, but I am going to pretend it is scientific) law that will help us explain this all-too-frequent phenomenon.

    It's a simple little observation called 'Hofstadter's Law', named after American professor and author Douglas Hofstadter, and it reads as follows:

    It always takes longer than you expect, even when you take into account Hofstadter's Law.

    Even the more famous and fictional Hofstadter is in on the game:

    (embedded video clip)

    The 'law' tries to describe and help us understand just how difficult it is to accurately estimate the time it will take to complete tasks of any meaningful complexity. And the kicker is that the law is recursive in nature - even taking the law into account doesn't prevent us from underestimating the time needed to complete complex tasks.

    It is a kind of cruel equation - we think task 'A' will take four hours - we take into account Hofstadter's Law and add a couple of hours to the estimate - but Hofstadter's Law itself kicks in AGAIN, to remind us it will STILL take longer than we estimate.
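That cruel equation can be sketched in a few lines (numbers mine and purely illustrative: I assume tasks run about 50% over whatever the current estimate is). Each time we pad the estimate up to the last 'actual', the law kicks in again and the actual moves further out.

```python
def hofstadter_estimate(initial_hours, rounds):
    """Toy model of Hofstadter's Law: each correction adopts the last
    round's actual time as the new estimate, but the (assumed) 50%
    overrun then applies to that bigger number too."""
    estimate = initial_hours
    actual = initial_hours * 1.5  # illustrative assumption: ~50% overrun
    for _ in range(rounds):
        estimate = actual       # pad the estimate to last round's actual...
        actual = estimate * 1.5  # ...and the law kicks in AGAIN
    return estimate, actual

est, act = hofstadter_estimate(4, 2)  # est = 9.0 hours, act = 13.5 hours
```

Starting from a four-hour guess, two rounds of 'accounting for the law' still leave the actual time ahead of the estimate, which is the recursive joke in the law's one sentence.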

    What can we do about this seemingly irrational but often true observation about our weakness in estimating the time required to complete tasks?

    Maybe we should think first about the available time we have and what, realistically and based on past experience, we can reasonably expect to accomplish in that time. Rather than looking at a complex project, or even a series of tasks, and trying to count up how much time they will take, (and inevitably underestimating), thinking about the available time first will force us to think more critically and probably come up with more reasonable expectations of what can be accomplished. 'What can you get done in three hours?' is a much easier question to answer than 'How long will it take you to write this article?'

    So the next time you are faced with the prospect of estimating how long it will take to complete a complex undertaking remember the wise words of our friend Hofstadter, and do your best to not fall into the trap of thinking 'This time it will be different', because it never is.

    Thursday
    Oct 25, 2012

    I'll trade you a Carl Sagan for your double of Niels Bohr 

    I am out at HR Technology Europe in Amsterdam the rest of this week, and working on about 2 hours of dodgy sleep on the overnight flight from New York last night, so today's post is totally being mailed in. If you are disappointed, please feel free to fill in the complaint form and ask for a refund.

    I am pretty sure my favorite non-reality TV show, and really the only TV show that I actually try to catch semi-regularly, is Big Bang Theory. If you are not familiar with the show, it is a comedy that features as its main characters a group of four friends who are all highly educated university-level scientists. They also happen to be a bit geeky, are irrationally focused on comic books and Star Trek, talk often of how they were, (and in some cases still are), mocked and picked on by 'cooler' people, and often struggle with a world that at times seems kind of stacked against them. The good-looking, socially confident, and outgoing people seem to get most of the breaks in life, while the friends' incredible intellectual capacity seems valuable only in the workplace, and kind of a hindrance everywhere else.

    So when I stumbled upon this post on the It's Okay To Be Smart blog titled 'Scientist Trading Cards - Collect the Whole Set!', I immediately thought about the guys on Big Bang Theory, and the probably thousands of science students everywhere who look up to and hold in extremely high regard the legends of science depicted in the set of Scientist Trading Cards.

    The trading cards, each one representing a legend of science ranging from physics, to chemistry, to astronomy, are purposely designed to mimic the styles of famous sports trading cards of the past, (the Isaac Newton card shares a design with baseball legend Brooks Robinson's, for example).

    Why bother taking note of these scientist trading cards? Why not just look at them as an amusing bit of fun and an interesting bit of design completed by someone clever with Photoshop?

    Well, here's why I think they are worth thinking about. In the HR/Talent/Recruiting industries we seem to have been talking for ages about hard-to-fill roles in the technology fields, and the seeming lack of suitable, trained talent for many of our most technical and scientific jobs. And while lots of potential remedies for this problem continue to be suggested, things like getting more training for displaced workers, loosening up the H1B visa process to welcome more foreign workers, and even increasing the amount of 'smart' automation in our businesses, we never seem to attack the problem at a more basic, fundamental level.

    Namely, convincing the next generation that science, technology, engineering, etc. are not just important, but can and should actually be careers to aspire to, with incredible legends, heroes, and role models - just like the professions that we routinely train our children to idolize: athletes, entertainers, and reality TV personalities. What if we could convince kids that being a great scientist could actually get them their own trading card?

    I dig the scientist trading cards. I wish they were actually real. I think I'd like the kinds of kids that would want to collect them.

    Friday
    Jul 13, 2012

    Off Topic - Schrodinger's Cat

    Last weekend while enjoying one of my favorite pastimes, studying particle physics, er, watching a loop of replays of Big Bang Theory, I ran across a reference to the famous (in scientific circles anyway) illustration of an aspect of quantum theory called Schrodinger's Cat. Ed. Note: the 'o' in the name Schrodinger should have the two tiny little dots over it, but I don't know how to render that in this text editor. Which is also another indication I probably should not be attempting to post on particle physics or quantum mechanics. But let's press on anyway. SCIENCE!

    So here's the basic idea of the Schrodinger's Cat illustration, (text lifted heavily from the What is? definition page; apologies and thanks in advance):

    We place a living cat into a steel chamber, along with a device containing a vial of hydrocyanic acid. Note: he did not actually DO this, it is just an illustration. There is, in the chamber, a very small amount of a radioactive substance. If even a single atom of the substance decays during the test period, a relay mechanism will trip a hammer, which will, in turn, break the vial and kill the cat.

    The observer cannot know whether or not an atom of the substance has decayed, and consequently, cannot know whether the vial has been broken, the hydrocyanic acid released, and the cat killed. Since we cannot know, according to quantum law, the cat is both dead and alive, in what is called a superposition of states.

    It is only when we break open the box and learn the condition of the cat that the superposition is lost, and the cat becomes one or the other (dead or alive). This situation is sometimes called quantum indeterminacy or the observer's paradox: the observation or measurement itself affects an outcome, so that the outcome as such does not exist unless the measurement is made. (That is, there is no single outcome unless it is observed.)

    Did you follow all that? Until you open the box the cat is not either dead or alive, it is both dead and alive, and only by opening the box and observing the contents does the cat actually become one or the other.
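For the programmers in the audience, here is a loose analogy (mine, and emphatically not a physics claim): you can think of the cat's state as lazily evaluated. Nothing is resolved until the first observation, and once observed, the outcome is fixed.

```python
import random

class CatBox:
    """Toy analogy for superposition: the outcome is not decided
    until the box is opened; observation resolves (and fixes) it."""
    def __init__(self, decay_probability):
        self.decay_probability = decay_probability
        self.state = None  # unresolved: 'both dead and alive'

    def open_box(self):
        if self.state is None:  # the first observation fixes the outcome
            roll = random.random()
            self.state = 'dead' if roll < self.decay_probability else 'alive'
        return self.state  # every later observation agrees with the first
```

Before `open_box()` is called, `state` is `None`, the stand-in for 'both dead and alive'; after the first call, every subsequent look returns the same answer. The class name and probability parameter are of course my own invention for the sketch.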

    When I first looked this illustration up and read through the description of the (fake) experiment, it really seemed kind of silly. I mean, it may be a valid explanation of the quantum theory of superposition, but beyond that it really could not have any possible implication for anything I care about in the real world, i.e. the NBA, barbecue, Pawn Stars, right? We know the cat can't really be BOTH dead AND alive at the same time. It is one or the other, but not both. It just doesn't make sense.

    But then I thought about it just a little bit more, and then in the context of many of the projects, roll-outs, system deployments, and other change management kinds of things I'd either been involved with or at least observed in the workplace and it started to make a little more sense to me.

    Truly, how the project or change was presented, and maybe even more importantly, whom the new shiny tool and improved process was pitched to first, did indeed impact the actual result. If we made our pitch to the right leader or executive first, and couched our pitch in terms that allowed Ms. Executive to see how she would benefit from whatever goods we were hawking, then we had a much better chance for success.

    And if we did not make the case early, and convincingly, and to the right folks, well then we pretty much were ensured of failure, or at least, lack of impact, i.e. eventually the box gets opened and the cat is dead.

    The thing is, both outcomes, project success or failure, definitely existed the entire time. It was only when the impacted organization opened the box, as it were, and had a look inside that it was revealed which outcome actually became real.

    That's it from me on this. And after having a quick scan through before hitting the 'publish' button, I too realize this very post is both dead and alive at the same time. It's only now, as you read this final sentence, that the actual state is determined.

    I hope the little post survived...

     

    Below is a short clip explaining the Schrodinger's Cat illustration, have a look if you are still intrigued, (email and RSS subscribers please click through)

     Have a Great Weekend!