
ETL TRAINING AGENDA

Objective of the Course:

ETL training covers the fundamentals of data warehousing and business intelligence. This combined course teaches you the extract, transform, and load (ETL) process, including verifying data and identifying anomalies.
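
For example, a typical verification pass flags rows that break basic expectations. The sketch below is a minimal illustration in plain SQL; the orders table and its columns are assumptions for illustration, not part of the course material.

    -- Identify duplicate order IDs in an assumed orders table.
    SELECT order_id, COUNT(*) AS copies
    FROM orders
    GROUP BY order_id
    HAVING COUNT(*) > 1;

    -- Flag anomalous rows: negative totals or a missing customer reference.
    SELECT *
    FROM orders
    WHERE order_total < 0
       OR customer_id IS NULL;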

ETL training modules include:

  • Data Warehouse Concepts, Architecture and Components 
  • ETL (Extract, Transform, and Load) Process 
  • ETL vs ELT: Must Know Differences 
  • Data Modelling 
  1. Conceptual
  2. Logical
  3. Physical Data Models
  • OLAP vs MOLAP (Multidimensional Online Analytical Processing) 
  • OLTP vs OLAP: What’s the Difference? 
  • Dimensional Model in a Data Warehouse 
  • Star and Snowflake Schemas in Data Warehousing (see the SQL sketch after this list) 
  • Data Marts: Types & Examples 
  • Data Warehouse vs Data Mart 
  • Data Lake & Its Architecture 
  • Data Lake vs Data Warehouse: Know the Difference 
  • Business Intelligence: Definition & Examples 
  • Data Mining Process, Techniques, Tools & Examples 
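
To make the dimensional-modelling topics concrete, here is a minimal star schema sketched in standard SQL. All table and column names (dim_date, dim_product, fact_sales) are illustrative assumptions.

    -- Dimension tables: descriptive attributes, one surrogate key each.
    CREATE TABLE dim_date (
        date_key    INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240131
        full_date   DATE NOT NULL,
        month_name  VARCHAR(10),
        year_number INTEGER
    );

    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        product_name VARCHAR(100),
        category     VARCHAR(50)
    );

    -- Fact table: measures plus a foreign key to each dimension.
    CREATE TABLE fact_sales (
        date_key     INTEGER REFERENCES dim_date (date_key),
        product_key  INTEGER REFERENCES dim_product (product_key),
        quantity     INTEGER,
        sales_amount DECIMAL(10, 2)
    );

A snowflake schema differs only in that the dimensions are normalised further, for example by splitting category out of dim_product into its own table.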

SQL Basics: 

  • Introduction to SQL 
  • Installing a database management tool and creating a database 
  • Creating a table and inserting data 
  • Querying the table 
  • Aggregating data 
  • Modifying/updating tables using queries 
  • Project: Design a store database (a starter sketch follows this list) 
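
As a hedged starting point for the project, the sketch below walks through the module's steps on an assumed products table; all names and values are placeholders.

    -- Creating a table and inserting data
    CREATE TABLE products (
        id    INTEGER PRIMARY KEY,
        name  VARCHAR(100) NOT NULL,
        price DECIMAL(10, 2)
    );

    INSERT INTO products (id, name, price) VALUES
        (1, 'Notebook', 2.50),
        (2, 'Pen',      1.20),
        (3, 'Stapler',  7.99);

    -- Querying the table
    SELECT name, price FROM products WHERE price < 5.00;

    -- Aggregating data
    SELECT COUNT(*) AS product_count, AVG(price) AS avg_price
    FROM products;

    -- Modifying/updating the table using a query
    UPDATE products SET price = 2.75 WHERE id = 1;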

ETL Tools: 

  • Overview of Pentaho, SSIS, and Informatica 
  • Data integration in general 

Setting up Pentaho: 

  • Setting up the environment 
  1. Install and operate the data integration suite, Pentaho Kettle.
  2. Set up database management and profile the database as a source.
  3. Install PDI, JDBC drivers, and other libraries.
  • Walkthrough of the Pentaho Kettle environment 

Working with Pentaho: 

  • Full data integration with Pentaho Kettle 
  1. Setting up repository
  2. Difference between Job and Transformation
  3. Reading Data using different Input steps
  4. Setting up variables and parameters
  5. Calculations and string functions
  6. Loading data to different destinations: database servers, flat files, Excel, etc.
  7. Automating jobs with the scheduler, and also from the command line in a Linux environment 
  • Project overview and detailed design 
  • Create your own project with Pentaho 
  • Step-by-step guidance for the project 
  1. Connect to various data sources
  2. Data conversion and manipulation
  3. Work with variables
  4. Load data to destination tables (a plain-SQL sketch follows this list).
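
In Kettle these steps are configured graphically rather than hand-written, but expressed as plain SQL the final load amounts to something like the sketch below; the staging table stg_customers and destination table dw_customers are assumptions for illustration.

    -- Transform rows from a staging table and load them into a destination table.
    INSERT INTO dw_customers (customer_id, full_name, country_code)
    SELECT
        customer_id,
        TRIM(first_name) || ' ' || TRIM(last_name),  -- string functions
        UPPER(country_code)                          -- simple cleanup calculation
    FROM stg_customers
    WHERE customer_id IS NOT NULL;                   -- basic verification filter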

Final Phase: Deploying the project: 

  • Deploy the project. 
  • Error logging and handling 
  • Performance monitoring 

Contact us to learn about all our training programs!
