Skillsoft Labs

Creating "Purpose Built" Technology Skills Content

March 16, 2021 | by Mike Hendrickson

Building Our Roadmap

Our Tech & Dev team at Skillsoft builds content that we believe meets or exceeds the demands of our customers and of tech professionals around the world in every major industry vertical. You might wonder how we decide what to build so that tech professionals get the content they need to remain well-skilled and able to meet today's challenges. Many components go into building great content, so let us walk through our approach while sidestepping a couple of our more proprietary pieces.

We start by trying to understand where a given technology is in relation to market visibility and, more importantly, market adoption. Ask yourself which is more important: content on innovations that perhaps only a handful of key developers need, or enterprise-grade content addressing established technologies? Hopefully, your answer is both. But there is still a balance to strike. Let us explore how we go about balancing these requirements.


To ensure we stay at the forefront of technologies, we typically look at what academia has on its radar, specifically the course offerings of academic extension schools. We balance that by examining job board postings to figure out which skills employers are currently seeking. We also keep in touch with analysts and the reports they produce on tech adoption and the trends that will affect professionals everywhere. Both Gartner and IDC have well-thought-out reports we utilize; many of these analyst reports center on where tech leaders plan to invest, both short-term and long-term.

We then add a healthy dose of analysis of the tech watering holes where developers ask questions and get answers. Plenty of pain points surface when filtering through these data sets. But our balancing act is not over quite yet; these are all external inputs. We also look to our internal data sets, which include consumption patterns, search data, NPS results, and like rates for topic areas and individual assets.
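To make the balancing act concrete, here is a minimal sketch of how external and internal signals could be blended into a single priority score per topic. All topic names, signal names, and weights are illustrative assumptions, not Skillsoft's actual model.

```python
# Hypothetical sketch: blend normalized external signals (job postings,
# academia) and internal signals (consumption, search) into one score.

# Per-topic signals, each already normalized to the 0-1 range (assumed).
signals = {
    "kubernetes": {"job_postings": 0.9, "academia": 0.6, "consumption": 0.8, "search": 0.7},
    "cobol":      {"job_postings": 0.2, "academia": 0.1, "consumption": 0.3, "search": 0.2},
}

# Relative importance of each signal (illustrative values).
weights = {"job_postings": 0.4, "academia": 0.2, "consumption": 0.25, "search": 0.15}

def priority(topic_signals, weights):
    """Weighted sum of one topic's normalized signals."""
    return sum(weights[name] * value for name, value in topic_signals.items())

# Rank topics by blended priority, highest first.
ranked = sorted(signals, key=lambda t: priority(signals[t], weights), reverse=True)
print(ranked)
```

A real pipeline would obviously normalize raw counts and tune weights against outcomes; the point here is only that disparate signals reduce to a comparable score before any roadmap commitment is made.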

While all our analytics are integral to figuring out what we build, another strong signal, and perhaps the most critical aspect of building content for purpose, is having conversations with customers. We have three distinct kinds of customers to talk with. The first is the HR person, who typically hears from their organization's tech learners. The second is the tech leader or line-of-business leader, who tries to align his or her team around a technology or method that will support their efforts. The third is the individual learner, who actually consumes the content. All these learning constituents provide invaluable feedback on what is important to them, their teams, and their organizations.

Another area we evaluate continuously is what certification vendors are doing with the various certifications they offer. We want to stay aligned with their requirements so that our learners are prepped on the latest standards. Speaking of standards, we also keep up with standards bodies as the various industry standards change. NICE, ISO, NIST, ECMA, ANSI, IEEE, IETF, W3C, and NCITS are examples of the standards bodies and frameworks we watch regularly.

We believe it is important to look at all these factors before making any budgetary commitments or promises on our roadmaps. Considering this broad range of inputs genuinely makes our content more usable for a more diverse set of potential learners.

With all these components helping to shape our curriculum, we believe our content build plans will align much better, now and in the future, with the needs of discerning professionals around the world. If our content does not meet your expectations, we would love to hear from you at Mike dot Hendrickson at skillsoft dot com. We will keep your feedback in mind as we set out to build our future content.


Our goal is to select the course components and modalities that will best enable the learner to build durable skills and retain the knowledge gained. Here we will explore some of the processes we use to build the content identified through our roadmap inputs. Given the delivery medium of self-paced online learning, our courses need to appeal to a wide range of multi-generational learners with varying levels of motivation and self-directedness. Whether a learner is building skills for an entry-level position or is a seasoned professional updating to the latest technologies, our courses must meet a variety of learners' needs.

To appeal to all our learners, we design courses designated with one of four experience levels, as seen immediately below. Each experience level may include several personas, each with different learning goals. In general, these are the levels of expertise we target on our continuum. We think it is essential to view these as a gradient as people progress through the different levels of skill mastery.

  • Novice: This content helps learners achieve overall literacy with a particular topic. The topics may cover a range of proficiencies and/or certification exams.
  • Beginner: This content provides learners with base knowledge in each area by introducing technologies, concepts, or skills. The goal is to build basic competency with new concepts and how they relate to typical applications. The topics may cover entry-level certification exams.
  • Intermediate: This content assumes that learners have a fundamental understanding of specific technologies, concepts, or skills. The goal is to become proficient with these concepts by applying them in practical applications. The topics may cover 2nd-tier certification exams.
  • Advanced: This content assumes that learners have a mastery of specific technologies, concepts, or skills. The goal is to become an expert in these concepts by applying them in specialized applications. The topics may cover 2nd- or 3rd-tier certification exams.
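The continuum above can be sketched as an ordered enum. This is only an illustration of the gradient idea; the names and the one-step progression rule are assumptions for the example, not an actual Skillsoft schema.

```python
from enum import IntEnum

class ExperienceLevel(IntEnum):
    """The four experience levels, ordered from literacy to mastery."""
    NOVICE = 1        # overall literacy with a topic
    BEGINNER = 2      # base knowledge; entry-level cert exams
    INTERMEDIATE = 3  # proficiency; 2nd-tier cert exams
    ADVANCED = 4      # mastery; 2nd- or 3rd-tier cert exams

def next_level(level: ExperienceLevel) -> ExperienceLevel:
    """Progress one step along the continuum, capped at ADVANCED."""
    return ExperienceLevel(min(level + 1, ExperienceLevel.ADVANCED))

print(next_level(ExperienceLevel.NOVICE).name)  # BEGINNER
```

Using an `IntEnum` keeps the gradient comparable, so "is this learner ready for intermediate content?" becomes a simple ordering check.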

With these learner goals in mind, we then apply our various presentation styles to the content and define the learning objectives. The first style is a conceptual (slide-based) video in which concepts are illustrated and discussed to help meet the topical objectives. We try to make these engaging, but not over-produced, to keep the learner attentive. The aim is to ensure the learner becomes literate or competent in the subject matter and well prepared for intermediate- or advanced-level content should they choose to continue their learning. An introductory course typically provides the grounding for the rest of the topic.

The second style is our demo-based video, which is just what it sounds like: demonstrating concepts, tools, applications, and methods on screen through an application, framework, device, or system.

Both styles incorporate real-world examples shown in a practical, meaningful manner. In many cases, we add hands-on labs to ensure the learner can do what they have learned, and each course has an assessment that tests the learner's new knowledge against the stated course objectives. These stylistic elements help learners retain new knowledge, build durable skills, and be assessed on the topical objectives. The combination of knowledge checks and assessments, along with a safe practice lab environment, truly makes a difference in any tech professional's learning. And if a learner still wants more, we offer mentoring in Percipio and, most often, a related Bootcamp that takes the learner down a deeper path to mastery. We determine which topics Bootcamps will cover through the analysis efforts mentioned earlier and through what we hear from customers and learners.
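A course plan built from these pieces can be checked mechanically. The sketch below validates that a plan contains the components described above; the component names and the required/recommended split are assumptions for illustration, not an actual Skillsoft schema.

```python
# Hypothetical sketch: check a course plan for the components discussed
# above (videos, assessments, labs, knowledge checks).

REQUIRED = {"video", "assessment"}                 # assumed: every course
RECOMMENDED = {"hands_on_lab", "knowledge_check"}  # assumed: most courses

def review_course(components):
    """Return (missing_required, missing_recommended) for a course plan."""
    comps = set(components)
    return REQUIRED - comps, RECOMMENDED - comps

missing_req, missing_rec = review_course(["video", "assessment", "knowledge_check"])
print(missing_req)  # set(): all required pieces present
print(missing_rec)  # {'hands_on_lab'}
```

Separating hard requirements from recommendations mirrors the text: every course gets an assessment, while labs are added "in many cases" rather than always.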


How do we assure quality with all these variables in motion? Three major areas are constantly monitored and evaluated: instructor selection and performance, editorial reviews, and video-quality reviews. We thoroughly evaluate instructors' capabilities through multiple interviews and through recorded video samples of their teaching. They need both the technology credentials and the ability to teach, communicate, and engage their eventual learners, and we keep a close eye on metrics that measure their success. This is quite different from letting nearly anyone put a video course on a platform and watching to see whether it gets used (the crowdsourcing model comes to mind). The problem is that popularity does not mean the subject is being taught well or effectively. This is particularly true at beginner levels, where learners may not know the difference between good, bad, right, or wrong.

Similarly, we evaluate our course design, including storyboards, objectives, writing and presentation style, and all graphical and coding elements. We also assess how much interactivity a particular topic needs. All in all, our quality check ensures that our content contains a good balance of conceptual slides, imagery, and demonstrations based on project requirements, and that the instructor is clear and concise. Striking this balance helps the learner focus on what is being taught rather than on unnecessary information and imagery cluttering the screen.

Although time-to-market is important to us, quality content is our guiding principle over expediency.