The Future of Credentials?

Following the post on the future of work, I was thinking about what implications this would have for education. The most obvious connection between work and education is credentials: the signposts that tell current or future employers that a person has a certain set of characteristics. The most obvious example of a credential is the degree your college or university has awarded you, telling the world that you meet a certain set of criteria. Often, these criteria are somewhat obscure and can mean all things to all people, as we can see from the fact that the same credential from different universities means different things, judging by the value people assign to it.

Today, a college degree has immense value for an employer, because it tells the employer that the student has gone through a certain set of courses and is therefore the right person to meet the employer’s requirements. From the employer’s perspective, the degree says that the prospective employee has the skills to build a career. What employers look for is the assurance that the prospective employee has what it takes to fit into the grand scheme of things, to become part of the larger picture that their organisation represents.

However, as the nature of work changes, as I said before, would such a credential, one that signals a generic ability to learn, be as important? I believe that in a scenario where an individual contributes their specific quantum of work in a larger value chain as a ‘freelancer’, the skills of the individual in that particular space would become much more important than their generic ability. This means that organisations would naturally be more interested in evidence of achievement in that specific area.

Such a shift in focus from organisations would necessarily mean that demonstrating ability in a particular area becomes more valuable than demonstrating overall, generic ability. Hence, I feel, the artefacts individuals generate in the course of their learning, whether project reports, papers authored, or creative work, would probably carry far greater weight than the degree. So, for instance, a paper written by a student on a topic related to their intended field of work would likely be of far more interest to employers than the degree or the grade.

In other words, the evidence of achievement, in the form of artefacts, or in the form of eminence would become a far more valuable resource by which to evaluate prospective employees than simply the degree.


The Future of Work?

The way we work has been undergoing massive changes over the last decade or more, but today, I believe, we are at the cusp of a fundamental shift in the relations of work, facilitated by the developments in technology. By relations of work, I mean the role each individual plays in a ‘value chain’ and how the part contributes to the whole.

Before the advent of the modern corporation, people did not work for corporations (they weren’t around, remember?). Rather, artisans, for instance, manufactured their final product, say a bicycle (if those were around …), as a single entity, and sold it in a marketplace.

With the advent of the corporation came the concept of people working in jobs where they did specific work, which contributed (often in indefinable ways) to the overall value chain. In this way, the individual would do their part of the work, and pass on their output to someone else, who would do their part of the work (value add) and so on …

This aspect is changing and, I believe, is set to change in bigger ways. As we are seeing, there is a trend towards organizations outsourcing their work to freelance contractors. As this grows (and we are seeing it happen most visibly in the technology sector), we would likely reach a state where, instead of many individuals being brought together under the ambit of an organization, people work more in their capacity as individuals, brought together under the ambit of the value chain. This value chain, by definition, would span organizations, which means we can expect to see, more and more, the value chain formed as a loose federation of individual freelance contributors, their output orchestrated by a set of organizations partnering to create a certain set of products or services.

So in terms of work structures this could well be a move back towards the ways of working the modern corporation replaced, though in ways that are very much of the new millennium. This has massive implications for the aspirations of youngsters (I don’t quite rely on the generation nomenclature, partly because I don’t understand it …): they can probably no longer aspire to long-term jobs, and designations may lose their meaning, with the content of work, and the satisfaction it generates, becoming the main defining factors.

In a sense, going back in time, but in a 21st-century way.


The Future Work Economy

A topic I have been thinking about for a while now is the future of work, and of employment. There are a number of questions which come up, to which, I must say, I don’t have any answers.

One question I think about is the expected mismatch between the demand for and availability of work in the future. Another is the possible mismatch between skill requirements and availability.

Coming to the expected mismatch between the demand for and availability of work, one dimension we need to consider when building future scenarios is overall population. We are told repeatedly that technology is meant to make our lives easier, so we can spend more time with our loved ones. While that’s a nice idea, what it means is that in the future we are likely to see much more work being automated at a global level, with people having to work less and less. This means lower demand for human resources, which could lead to the kind of future this op-ed from the Washington Post describes.

That said, there is another aspect we need to consider: while a number of traditional occupations might not be around a few decades from now, a number of new occupations, or even industries, could be generated over time, as this piece from the University of Kent tells us. While video games have been around for a while, no one could have anticipated the level of growth the gaming industry would see, for instance. New occupations and industries, of course, would require different skills, something we need to prepare our children for.

The other dimension is the mismatch between the demand for and availability of skills. With Europe growing older, for instance, it will likely need to import workers, and with Africa growing younger, it’s quite simple to see where the additional workers required would come from.

This is an illustration of the possible imbalances we could see in the future. The larger point here is this … the regions of the world which are well-off are likely to have fewer people in working age-groups in the future, while the regions with larger working-age populations would likely be unable to provide access to the kind of education required to meet the needs of the job market.

Does this mean that it might be important for certain regions of the world to subsidise education and skill-building in other parts of the world? Should Japan, for instance, invest in education/skill-building in India? In other words, are we headed toward a far more integrated world as the viable solution to the problems of tomorrow?


Predictive Learning

In today’s L&D landscape, the way businesses determine who should participate in what training isn’t far from some sort of conjuring act. More often than not, the result is a mixed bag, and many of the L&D professionals I speak to tell me that Level 1 scores (based on the Kirkpatrick model) tend towards the lower end of the spectrum.

There are typically two ways a business determines training participation. One is mandated training (usually related to promotion or growth); the other is nomination by the business manager. Both involve picking from a ‘menu’ of available programs, and neither really takes into consideration the actual learning needs of the individual.

This is where the idea of predictive learning comes in. The idea is simple … today, with the technology available to us, especially in the Big Data/Analytics domains, data about what has worked in the past, and in what context, is available to the organization at large scale. This data comes from training, HR, and operations/business systems, and it can be leveraged to determine the training solution most likely to work in a particular employee’s context. As with Big Data generally, this needn’t explain the connection between cause and effect; rather, it looks at the linkages as they have been observed in the past.
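To make the idea concrete, here is a minimal sketch of what such a linkage-based recommendation could look like, assuming an organization has joined its training, HR, and business data into one table. Everything here is hypothetical: the column names (role, tenure_years, skill_gap, course_id, outcome_score), the sample records, and the choice of a scikit-learn random forest are illustrative stand-ins, not a description of any particular product or of how an L&D team actually does this.

```python
# A minimal, hypothetical sketch of predictive learning: learn which training
# worked in which employee context in the past, then score candidate courses
# for a new employee. Column names and data are placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Historical records: employee context + course taken + how well it worked
# (e.g. a post-training performance or evaluation score, normalised to 0-1).
history = pd.DataFrame({
    "role":          ["analyst", "analyst", "manager", "manager", "engineer"],
    "tenure_years":  [1, 3, 5, 7, 2],
    "skill_gap":     ["reporting", "reporting", "coaching", "coaching", "cloud"],
    "course_id":     ["RPT-101", "RPT-201", "COA-110", "COA-110", "CLD-150"],
    "outcome_score": [0.62, 0.81, 0.74, 0.90, 0.55],
})

# Learn the association between context + course and observed outcome,
# without trying to explain why the linkage exists.
X = pd.get_dummies(history[["role", "tenure_years", "skill_gap", "course_id"]])
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, history["outcome_score"])

def recommend(employee: dict, candidate_courses: list[str]) -> str:
    """Score each candidate course for this employee and return the best one."""
    rows = [dict(employee, course_id=c) for c in candidate_courses]
    candidates = pd.get_dummies(pd.DataFrame(rows)).reindex(columns=X.columns, fill_value=0)
    scores = model.predict(candidates)
    return candidate_courses[int(scores.argmax())]

print(recommend({"role": "analyst", "tenure_years": 2, "skill_gap": "reporting"},
                ["RPT-101", "RPT-201", "COA-110"]))
```

In practice the interesting work would be in assembling and cleaning the joined dataset and defining a meaningful outcome measure, not in the model itself; the sketch only shows the shape of the approach.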

An important aspect of this picture is that it shifts the focus from training, and from L&D, to the individual learner, and makes the entire process people-centric.

One concern, though, is that the resulting requirements could be too granular, and too tailored to individual needs, to be viable from a delivery perspective. More about this later …


HR Change Agenda

Over the last few days, two pieces have appeared in HBR about the change agenda for HR. One, written by Ram Charan, talks about splitting HR; the other, by Cathy Benko and Erica Volini, is about what it will take to fix HR. At the most fundamental level, both pieces acknowledge that there is a problem with the HR function in the organization, and, since they agree on that, they also agree that something needs to be done about it. And that’s where, more or less, they move in different directions, as you will see from the pieces.

Let’s step back and look at some of the reasons these problems exist, from the perspective of HR practitioners. The first thing to understand is that in today’s world of business, with its steady level of complexity and increasing levels of disruptive change, HR managers need to understand the details of the business, both internal and external to the organization. Only then can HR managers play a meaningful role in defining organization strategy. In other words, HR managers need to sit at the confluence of business management and people management. However, most of the HR practitioners I talk to are nowhere close to this point. Most are generalists, not SMEs when it comes to business operations, which means they have to take guidance from business managers and formulate practices based on that guidance.

Because that might sound a bit abstract, let me take an example. Let’s say a business manager decides that some skills are lacking in his team. The manager reaches out to the L&D team and tells them what type of training is required; the L&D team searches through a catalogue, identifies the training, and executes the logistics to deliver it. The L&D team, in this example, has no understanding of the reason for the training requirement, the objective to be met, or the outcomes the training should produce for participants. In this scenario, the team is essentially fulfilling requirements rather than giving strategic input into forecasting medium- to long-term training needs, how these would help address business objectives, and how they would support employee development.

To summarize, it is at the intersection of business and people management that there is a gap, and it is this gap that needs to be filled. To address it, we need people who have a sound understanding of the complexity and challenges of the business, and of how people practices can help address those challenges and meet that complexity. Whether this is to be achieved by splitting the HR function, I don’t know, though the debate throws up more questions than just that. It raises the point I am making here … that instead of HR practitioners only taking guidance and fulfilling requirements, they need to be in a place where they can add strategic value, and that this requires a change in the way HR managers look at the intersection of business management and people management.


TM/HR

Over a period of time, the concept of Talent Management has become a hot topic in HR circles, and many people are talking about the idea. However, I don’t know of any two sources that give the same definition of Talent Management. A number of things I have read include:

  • Talent Management is strategic while HR is transactional
  • Talent Management is about retaining high-flyers while HR is for lesser mortals
  • Talent Management is about managing skills while HR is about managing the policies related to people
  • Talent Management is old wine in new bottles
  • It’s a term coined by clever management consultants to make a quick buck (no, I haven’t read that, but that’s always a pet theory of quite a few people, isn’t it?)

Are these true? I don’t quite think so. To some extent, I feel Talent Management is the natural progression of the HR philosophy. Essentially, the difference between HR and TM is more to do with how the organization looks at its main asset … people! In the earlier, HR world, people were one of the factors of production, creating value for the organization in a somewhat undistinguished way, rather (though this is not an exact parallel, just an illustration) like one machine being interchangeable with another, and none the wiser.

TM is based on the understanding that each individual is distinct, with a distinct personality and a unique set of talents, skills, aspirations, and potential, and so needs to be treated individually. This means that growth needs, based on those aspirations, differ from person to person, and therefore development plans, both for skills development and for growth within the organization, need to be tailored to the particular person. And this, I feel, is the primary difference between TM and HR.


Social Eminence

A discussion I was having the other day with colleagues, about eminence and the role of social media in creating the persona of people who are experts at things, brought out some rather interesting thoughts. One of the ideas that came out was that social reputation is based on one’s willingness to share knowledge. While I agree with that to a large extent, this viewpoint confuses knowledge with the act of sharing. One can share things on social media without really knowing much about them. One of the things I see on Twitter, for instance, is that the rate at which people share links would mean they are reading at probably a thousand words per minute. Quite a few people I know just glance through an article or blog and share it on social media. This is why I say that sometimes knowledge can be confused with the act of sharing.

Another important thing to understand is that it is very easy to manufacture things on social media. You might have seen a number of quotes from Albert Einstein on the web, and I don’t know how many of them are actually attributable to him. To take the instance of a talk show I was watching, the analyst on the show quoted a long-departed leader as having said something. This didn’t quite sound logical to me, so I started searching. After much searching, I found a blog which explained how the lie had been fabricated and why, and how it was circulated all over the world over social media. The “fact” may find its way twice around the world before folks start finding out. Also, there will be a number of folks on social media who will have spread the word, and very few who would take the effort to validate it. What this means is that social eminence can be manufactured, and while there are self-correcting mechanisms in the social ecosystem, these mechanisms may not always be effective in a world with a very short memory. By the time you figure out something is wrong, nobody is really interested, and setting the record straight becomes a moot point.

The point I am trying to make is that we need to be selective about the sources we subscribe to, and that we need to do our research before publishing something, which is seldom done.