Key Areas of Competency:

  • Access content and run reports
  • Navigate through the system and trace the life cycle of a data item
  • Analyze data


  • Excellent communication skills
  • Basic experience with the Microsoft Windows operating system
  • Basic Internet and web browser usage experience
  • Knowledge of an organization’s business intelligence process and reporting needs
  • Basic knowledge of XML


Basic understanding of:

  • Hadoop, Big Data, and how Big Data solutions work in the cloud
  • Business analysis software
  • Customized reports
  • Query models
  • Effective prompts
  • Report specifications
Skills Roadmap

Gain insights into key industry skills and competencies by using these roadmaps to enhance and transform curricula for new-collar jobs.

  Data Science Fundamentals

Data Scientists extract knowledge and insights from structured and unstructured data. They draw upon the practice of data analysis, using predictive analytics, data mining, text mining, pattern recognition, data modeling, machine learning, and various statistical methods in order to solve large scale problems and understand the meaning behind vast data sets.

  • Examine Data Science from a practitioner point of view and introduce topics from basic concepts and methodologies to advanced algorithms.
  • Concentrate on data compilation, recommender systems, preparation and modeling, regression, time series analysis, multivariate analysis, Bayesian methods, stochastic models, clustering, and other techniques that occur throughout the data science life cycle.
  • Gain practical knowledge with open source tools and statistical packages (e.g., R, MATLAB, Python, SPSS).
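Among the modeling topics listed above, regression is one of the simplest to make concrete. The sketch below fits an ordinary least-squares line in pure Python; the function name and toy data are invented for illustration, and a course would typically use a statistical package such as R or Python's scientific libraries instead.

```python
def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = a*x + b for paired observations."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance(x, y) divided by variance(x)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Toy data lying exactly on y = 2x + 1
slope, intercept = fit_linear([1, 2, 3, 4], [3, 5, 7, 9])
```

The same closed-form fit is what `statistics.linear_regression` (Python 3.10+) or R's `lm` computes for the single-variable case.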

  Applied Data Science with Python

  • Increase experience with object-oriented and functional programming/scripting languages such as Python and Java.
  • Expand understanding of statistical programming in a language such as R or SAS.
  • Solidify data skills in Python before diving into machine learning, big data and deep learning in Python.
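To give a sense of the "data skills in Python" that precede machine learning, here is a minimal group-and-aggregate sketch using only the standard library; the records and field names are invented for illustration.

```python
from collections import defaultdict

# Toy records of (region, sales) pairs -- invented for illustration.
records = [("east", 100), ("west", 80), ("east", 50), ("west", 20)]

# Aggregate sales per region, a basic data-wrangling step
# that libraries like pandas generalize with groupby().
totals = defaultdict(int)
for region, sales in records:
    totals[region] += sales
```

Here `totals` ends up mapping each region to its summed sales.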

  Hadoop Fundamentals

  • Evaluate Hadoop's core components and support of open source projects.
  • Illustrate Hadoop's conceptual design and how to use the platform to manipulate data without complex coding.
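Hadoop's conceptual design centers on MapReduce: a map phase emits key-value pairs, and a reduce phase aggregates them per key. The classic word-count example can be sketched in plain Python; this only simulates the model locally, whereas a real Hadoop job distributes both phases across a cluster.

```python
from collections import defaultdict

def map_phase(lines):
    """Emit a (word, 1) pair for every word, as a Hadoop mapper would."""
    for line in lines:
        for word in line.split():
            yield word, 1

def reduce_phase(pairs):
    """Sum the counts for each key, as a Hadoop reducer would."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

counts = reduce_phase(map_phase(["big data", "big deal"]))
# counts == {"big": 2, "data": 1, "deal": 1}
```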

  Spark Fundamentals

  • Introduce Apache Spark as a general engine for large-scale data processing, including the fundamentals of its design and its everyday applications.
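A core piece of Spark's design is that computation is expressed as lazy transformations (such as map and filter) followed by an action that triggers execution. The shape of such a pipeline can be approximated with plain Python generators; this only mimics the style of a Spark job, which in practice distributes the work through the pyspark API.

```python
data = range(1, 11)

# Transformations: lazy in Spark, and generators are likewise lazy here.
squared = (x * x for x in data)               # analogous to map()
evens = (x for x in squared if x % 2 == 0)    # analogous to filter()

# Action: forces evaluation, analogous to a Spark sum() or collect().
total = sum(evens)
```

Nothing is computed until `sum` runs, mirroring how Spark defers work until an action is called.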

  Database Fundamentals

  • Review relational SQL and NoSQL databases, such as DB2, Oracle, Postgres, and Cassandra, as well as the optimal design of ETL infrastructures, using tools like DataStage, for a variety of data sources.
  • Consider data pipeline and workflow management tools, such as InfoSphere, Informatica, etc., and data warehousing to keep data separated and secure through replication and failover techniques.
  • Recognize the data warehouse life cycle, as well as star schema and/or denormalized database design.
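A star schema keeps numeric measures in a central fact table and descriptive attributes in surrounding dimension tables, joined by keys. Here is a minimal sketch using Python's built-in SQLite module; the table and column names are invented for illustration.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Dimension table: descriptive attributes of each product.
cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT)")
# Fact table: one row per sale, referencing the dimension by key.
cur.execute("CREATE TABLE fact_sales (product_id INTEGER, amount REAL)")

cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "widget"), (2, "gadget")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                [(1, 10.0), (1, 5.0), (2, 7.5)])

# Typical star-schema query: join the fact table to a dimension,
# then aggregate the measure per dimension attribute.
rows = cur.execute(
    "SELECT p.name, SUM(f.amount) FROM fact_sales f "
    "JOIN dim_product p ON p.product_id = f.product_id "
    "GROUP BY p.name ORDER BY p.name"
).fetchall()
```

The denormalized variant mentioned above would instead fold the product name directly into the sales table, trading storage for simpler queries.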

  Design Fundamentals

  • Explore consumable data visualization techniques and a practical or applied understanding of UI/UX design principles and/or data visualization theory.
  • Develop interactive dashboards to help drive decision-making.
  • Establish an understanding of business drivers across sales, marketing, performance reporting, etc. and how this understanding drives outcomes.

IBM Digital badge

Make your courses IBM Digital badge eligible.

Key software and other resources

Free IBM software, platforms, and services

Other tools and system requirements

Explore IBM Skill Accelerator Roadmaps

Learn more about how IBM can partner with your campus.