Following the post on the future of work, I was thinking about what implications this would have for education, and the most obvious connection between work and education is credentials. These are the signposts that tell current or future employers that a person has a certain set of characteristics. The most obvious example is the degree your college or university has given you, telling the world that you meet a certain set of criteria. Often, these criteria are somewhat obscure, and may mean all things to all people, as we can see from the fact that the same credential from different universities means different things, judging by the value people assign to them.
Today, a college degree has immense value for an employer, because it tells the employer that the student has gone through a certain set of courses, and is therefore the right person to meet the employer's requirements. From the employer's perspective, the degree signals that the prospective employee has the skills to build a career. What employers look for is assurance that the prospective employee has what it takes to fit into the grand scheme of things, to become a part of the larger picture that their organisation represents.
However, as the nature of work changes, as I said before, would such a credential, a generic certificate of the ability to learn, be as important? I believe that in a scenario where an individual contributes their specific quantum of work in a larger value chain as a 'freelancer', the skills of the individual in that particular space would become much more important than their generic ability. This means that organisations would naturally be more interested in evidence of achievement in that specific area.
Such a shift in focus from organisations would mean that demonstrating ability in a particular area becomes more valuable than demonstrating overall, generic ability. Hence, I feel, artefacts generated by individuals in the course of their learning, whether in the form of project reports, papers authored, or creative work, would probably have a far greater impact than the degree. So, for instance, a paper written by a student on a topic related to their sphere of work would likely be of far more interest to employers than the degree or the grade.
In other words, evidence of achievement, whether in the form of artefacts or of eminence, would become a far more valuable resource for evaluating prospective employees than the degree alone.
The way we work has been undergoing massive changes over the last decade or more, but today, I believe, we are at the cusp of a fundamental shift in the relations of work, facilitated by the developments in technology. By relations of work, I mean the role each individual plays in a ‘value chain’ and how the part contributes to the whole.
Before the advent of the modern corporation, people did not work for corporations (they weren't around, remember?). Rather, artisans manufactured their final product, say a bicycle (if those were around …), as a single entity, and sold their products in a marketplace.
With the advent of the corporation came the concept of people working in jobs where they did specific work, which contributed (often in indefinable ways) to the overall value chain. In this way, the individual would do their part of the work, and pass on their output to someone else, who would do their part of the work (value add) and so on …
This aspect is changing, and is, I believe, set to change in bigger ways. As we are seeing, there is a trend towards organizations outsourcing their work to freelance contractors. As this grows (and we are seeing it happen most in the technology sector), we would likely come to a state where, instead of many individuals being brought together under the ambit of an organization, people would work more in their capacity as individuals, brought together under the ambit of the value chain. This value chain, by definition, would span organizations, which means that we can expect to see, more and more, the value chain being formed as a loose federation of individual freelance contributors, their output orchestrated by a set of organizations partnering together to create a certain set of products or services.
So in terms of work structures, this could well be a move back towards the ways of working the modern corporation replaced, though in ways which are very much of the new millennium. This has massive implications for the aspirations of youngsters (I don't quite rely on the generation nomenclature, partly because I don't understand it …): they can probably no longer aspire to long-term jobs, and designations may lose their meaning, with the content of work, and the satisfaction it generates, being the main defining factors.
In a way, going back in time, but in a 21st century way.
In today's L&D landscape, the way businesses determine who should participate in what training isn't far from a conjuring act. More often than not, the result is a mixed bag, and many of the L&D professionals I speak to tell me that Level 1 scores (based on the Kirkpatrick model) more often than not tend towards the lower end of the spectrum.
There are typically two ways a business determines training participation. One is mandated training (usually related to promotion or growth); the other is nomination by the business manager. Both are based on picking from a 'menu' of available programs, and neither really takes into consideration the actual learning needs of the individual.
This is where the idea of predictive learning comes in. The idea is simple … today, with the technology available to us, especially in the Big Data/Analytics domains, data about what has worked in the past, and in what context, is available to the organization at large scale. This data comes from training, HR, and operations/business systems, and can be leveraged to determine which training solution would likely work in a particular employee's context. As with Big Data generally, this needn't explain the connection between cause and effect; rather, it can rely on the linkages as they have been observed in the past.
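As a minimal sketch of what such a data-driven recommendation could look like (the record fields, program names, and scoring rule here are my own illustrative assumptions, not a description of any real HR system), one could simply look up which training produced the best average outcome for employees with a similar profile in the past:

```python
from collections import defaultdict

# Hypothetical historical records: (role, experience_band, training, outcome_score).
# In a real system these would come from joined training, HR, and business data.
history = [
    ("developer", "junior", "Clean Code Basics",     4.5),
    ("developer", "junior", "Advanced Architecture", 2.1),
    ("developer", "senior", "Advanced Architecture", 4.8),
    ("analyst",   "junior", "SQL Fundamentals",      4.2),
    ("developer", "junior", "Clean Code Basics",     4.1),
]

def recommend_training(role, experience_band):
    """Recommend the training with the best average past outcome for
    employees matching this profile. This is a purely correlational
    lookup: no attempt to model cause and effect, as noted above."""
    scores = defaultdict(list)
    for r, band, training, outcome in history:
        if r == role and band == experience_band:
            scores[training].append(outcome)
    if not scores:
        return None  # no comparable history: fall back to manager nomination
    return max(scores, key=lambda t: sum(scores[t]) / len(scores[t]))

print(recommend_training("developer", "junior"))  # -> Clean Code Basics
```

A production version would of course use far richer employee context and a proper statistical model, but the shape of the idea is the same: let observed past linkages, not a fixed menu, drive the recommendation.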
An important aspect of this picture is that it shifts the focus from training, and from L&D, to the individual learner, and makes the entire process people-centric.
One concern with this, though, could be that the outcome of the requirements could be way too granular, and too tailored to individual needs, so as to be unviable from the delivery perspective. More about this later …
With blogging, I find, the ideas are relatively simple to define and build upon, while finding the topic is the difficult part. With the topic settled, let's get to the idea of this post.
There are two basic problems with education delivery today, I feel. One is the shortage of skilled teachers, and the other is the piecemeal approach to concepts taught to children. Let’s look at these one by one.
When talking about a shortage of skilled teachers, I don't mean at the aggregate level. The problem facing education is that while skilled teachers are available in pockets, there are also pockets where they are not. In the age of technology, this gap should be an easy one to bridge. Having said that, I believe one can't rely on purely virtual delivery when introducing a new subject, which is what the education system does for the most part. So we might need a blended approach.

For this blended approach, content could be divided into three parts: pure theory, guided problem-solving, and practical application. Concepts of a subject could be taught through recorded lectures, for which a panel of distinguished teachers could be identified from the vast pool available. This would help bring the best teachers to students in areas where they are not available, and at the same time standardize the delivery of basic concepts.

Building upon these concepts, to develop the skill of applying them, a set of real-life scenarios could be developed. These would need to be delivered in person, and would help students learn how the concepts can be applied to solve problems. They could be simulations or case studies, depending on the requirements of the course being taught. Building on these would be labs to experiment and to apply concepts, as required.
The other aspect is the piecemeal approach to teaching. Children are taught mathematics and physics and economics and history as separate subjects. The concepts they learn, and their application, are demarcated by subject, while in reality these subjects are interrelated, and so their teaching should be too. To address this, and to develop holistic problem-solving skills which enable students to see the big picture, appreciate systems, and build systemic thinking, classwork, homework, and exams should be based on a systemic approach, where students see the system as a whole, understand it in its entirety, and understand the impact of one thing on another.
A recurring question in discussions is how mathematics should be taught. There is a strong view that given the computing power available to us, we should relook at the basic maths curriculum. So I went looking, and found this video, which I feel looks at the problem in a logical way.
Conrad Wolfram gives some compelling reasons why maths education should change, and a description of how it should change, too.
Hand-calculating the mundane way should not be the focus of the curriculum. Rather, the focus should be on building and reinforcing concepts, leaving the calculating to computers. In other words, as he says, students should be taught the procedures which define fundamental concepts, but the implementation of those procedures should be left to computer programs. For example, students should know what a square root is and how it is calculated, but they shouldn't have to calculate it, beyond illustrations.

And here is the cool part, he says … focus on teaching students how to write programs to calculate square roots, rather than having them mechanically execute the procedure. This would immensely help students clarify their concepts (how can one write a program without understanding the underlying principles to a very large extent?), while at the same time helping them become more comfortable with applying those concepts.

In other words, our mathematics curriculum should stress understanding and application (application to real-world problems is a very good way of teaching these concepts) rather than the mundane calculations which stress out students and parents alike. After all, why should a child lose marks in an exam (that's what happens) if he or she takes the square of 5 to be 10, when all the conceptual aspects of the solution are correct, and the only mistake is a calculation mistake?
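To make the square-root example concrete: here is the kind of classroom exercise this approach suggests (the method and names are my own illustration, not taken from the talk). A student implementing Newton's method has to understand what a square root actually is before a single line will work:

```python
def my_sqrt(x, tolerance=1e-10):
    """Approximate the square root of x using Newton's method.

    The conceptual insight the student must grasp: if g is a guess for
    sqrt(x), then x/g lies on the other side of the true root, so
    averaging g and x/g gives a better guess."""
    if x < 0:
        raise ValueError("square root of a negative number is not real")
    if x == 0:
        return 0.0
    guess = x if x >= 1 else 1.0  # any positive starting guess converges
    while abs(guess * guess - x) > tolerance:
        guess = (guess + x / guess) / 2  # average the guess and x/guess
    return guess

print(round(my_sqrt(25), 6))  # -> 5.0
```

Writing this forces exactly the understanding Wolfram argues for, while the tedious repetition of the arithmetic is left to the machine.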
Connect this with the post I had written earlier, and a rather innovative picture of mathematics teaching emerges.
For quite a while, I have been thinking that maybe I am the only one who doesn't understand what these words mean. I mean, with the buzz around these concepts (and here I mean the concepts, not the technology), I assumed these must be complex concepts to define, but the definitions I was able to understand were all quite simple.
Big data is just that … BIG! There are essentially three things which define it:
1. There's lots of it! Much more than we could have imagined even a couple of years ago.
2. The form of this data is highly diverse. There's text, images, videos, and what have you. There's structured data and unstructured data, and data comes with its own context, which makes it even more complex to handle.
3. It's being generated at a very fast pace. In fact, writing this blog is adding to big data, as is your tweet, those pictures you post on Facebook, or those status updates that you like.
I wanted to check whether this definition is correct, and I came across this video from Ericsson Research, which describes it quite simply with an example. If you want to get past the buzz and understand the concept, I would suggest you watch it.
So where does analytics come into the picture? Well, if there's so much data, there's also the fact that it's very difficult to build any coherent picture from this mass of data, and this is the problem the analytics domain addresses. Analytics helps us make sense of big data!
So what does Big Data Analytics need?
1. It requires infrastructure which is able to scale up or down based on the demands of those who are generating this data and those who are analyzing it. This means the infrastructure needs to be flexible, which can be handled much more easily with cloud solutions; this is where the cloud comes into the picture of big data and analytics.
2. It requires the applications which gather this data. A lot of this data is generated by automated systems like sensors, and through mobile devices. With equipment communicating with other equipment, the concept of the internet of things comes into the picture. And with the mobile device explosion, the importance of mobile applications and mobility solutions as an integral part of the picture also becomes apparent.
3. It requires the statistical and technology foundation which will help users or systems to make sense of this data. This is the analytics piece of the picture.
Here's a nice video about an IBM study on analytics.
This is how the picture gets a little clearer, and we can see how the cloud, internet of things, mobility, big data, and analytics are coming together to create a whole new technology paradigm.
In today's world, where the competition for top faculty is increasing, there is a felt need to enable these faculty to reach and teach more and more students. At the same time, faculty are investing more of their time in research. This means there needs to be a way to deliver education differently, one which enables faculty members to reach more students while giving them more time to work on non-teaching projects.
We see senior professors teaching the same courses year after year to different sets of students. If you look at the anatomy of a course, it is made up of concepts, applications of these concepts, and reinforcement of these concepts through discussions, class assignments, homework, and so on. This could change if we leverage the technology available today to deliver education more effectively.
To begin with, concepts which don't change much year after year need not be taught year after year by senior faculty members. These senior faculty members could record lecture sessions in a modular way, and these video recordings could be used to introduce concepts to students; for example, these videos teach the concepts of relativity. As you can see, the videos were uploaded a few years back, but they are still as relevant as when they were uploaded. Unless it is cutting-edge work, content stays relevant, and cutting-edge work isn't taught at quite the same scale as regular courses are.
These videos help us understand that senior professors can build content which can then be used to teach students year after year, freeing up much of their time to guide and undertake research, and to take on further activities like refereeing papers and driving innovation in their subjects. Of course, videos by themselves aren't enough to teach, which means that in addition, lecturers could use these videos to teach the application of these concepts, to drive classroom discussions, and to manage the learning process through collaboration, classroom assignments, and that word we all love a lot … homework.
In this way, repeatable activities can be brought online, and activities which require student interaction can be handled as a separate component from the teaching of theoretical concepts, something I have written about earlier. This could change the way college education is delivered, addressing the need for access to top-quality content for a wider range of students at a much lower cost, while enabling teachers to focus on teaching as well as on non-teaching work.
I read a nice blog about innovation here, and thought I would write down some thoughts that came to me while reading it:
1. Innovation != Invention. Innovation is not the same as inventing something new. It could be about finding new ways of doing the same thing, or using the same way to do different things. How many of you have opened a beer bottle with a spoon? Would you call that an innovation?
2. Innovation != R&D. This seems to be a corollary of the innovation != invention idea.
3. Innovation != Glamour. Innovation need not create something new or glamorous. Oftentimes, it's about doing day-to-day things more effectively, or more efficiently. A friend had shared a picture on Facebook of a guy connecting the drain of the AC to the input of the cooler … water out, water in. Not glamorous, but would you say this is innovation?
In a nutshell, innovation, I feel, is a simpler, more effective, more efficient way of solving a problem. And this way need not necessarily come from a particular team or part of the business. That's why crowdsourcing makes sense … innovation is about first identifying a problem (and problems or customer requirements aren't always easy to identify), and then finding a solution which is feasible, and which works. I feel both of these make up the idea of innovation. What do you think?