Tuesday, November 2, 2021

Learn SQL (Training)

Colleagues, master SQL, the core language for Big Data analysis, and enable insight-driven decision-making and strategy for your business. Perform analysis on data stored in relational and non-relational database systems to power strategic decision-making. Learn to determine, create, and execute SQL and NoSQL queries that manipulate and dissect large-scale datasets. Begin by leveraging the power of SQL commands, functions, and data-cleaning methodologies to join, aggregate, and clean tables, as well as performance-tune your analyses to provide strategic business recommendations. Finally, apply relational database management techniques to normalize data schemas in order to build the supporting data structures for a social news aggregator. Highly marketable, skill-based training modules include: 1) Introduction to SQL - learn how to execute core SQL commands to define, select, manipulate, control access to, aggregate, and join data and data tables. Understand when and how to use subqueries, several window functions, and partitions to complete complex tasks. Clean data, optimize SQL queries, and write advanced JOINs to enhance analysis performance. Explain in which cases you would use particular SQL commands, and apply the results from queries to address business problems (Project: Deforestation Exploration); 2) Management of Relational & Non-Relational Databases - databases need to be structured properly to enable efficient and effective querying and analysis of data. Build normalized, consistent, and performant relational data models. Use SQL Data Definition Language (DDL) to create the data schemas you design in Postgres, and apply SQL Data Manipulation Language (DML) to migrate data from a denormalized schema to a normalized one. Understand the tradeoffs between relational databases and their non-relational counterparts, and justify which one is best for different scenarios.
Finally, with a radical shift of paradigms, learn about MongoDB and Redis to understand the differing behaviors and requirements of non-relational databases (Project: Udiddit, a Social News Aggregator).
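As a small taste of the module 1 material, here is a minimal sketch of the JOIN-plus-aggregation pattern the course covers, written against a made-up two-table schema using Python's built-in sqlite3 (the course itself uses Postgres, but the SQL is the same idea):

```python
import sqlite3

# In-memory database with a hypothetical two-table schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         customer_id INTEGER REFERENCES customers(id),
                         amount REAL);
    INSERT INTO customers VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO orders VALUES (1, 1, 30.0), (2, 1, 20.0), (3, 2, 40.0);
""")

# JOIN the two tables, then aggregate per customer.
rows = conn.execute("""
    SELECT c.name, COUNT(o.id) AS n_orders, SUM(o.amount) AS total
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
    ORDER BY total DESC;
""").fetchall()

for name, n_orders, total in rows:
    print(name, n_orders, total)
```

The same GROUP BY / JOIN shape scales from this toy schema up to the large datasets the course works with.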

Sign-up today (teams & execs welcome):  https://tinyurl.com/nyfks6ne 


Much career success, Lawrence E. Wilson - Online Learning Central


Monday, November 1, 2021

Linux Foundation Certified Engineer (LFCE - LFS311)

Colleagues, the Linux for System Engineers (LFS311) training will equip you to become a Linux Foundation Certified Engineer (LFCE). The need for sysadmins with advanced administration and networking skills has never been greater, and competition for people with experience is fierce. Whether you're looking for expert test prep for the Linux Foundation Certified Engineer (LFCE) certification, need training to help transition to Linux from other platforms, or you're just brushing up on these vital admin and networking skills, this instructor-led course will teach you what you need to know. Learn advanced Linux administration skills, including how to design, deploy, and maintain a network running under Linux, how to administer network services, the skills to create and operate a network in any major Linux distribution, how to securely configure the network interfaces, and how to deploy and configure file, web, email, and name servers. Skill-based training modules, mapped to the LFCE cert exam, include: 1) Linux Networking Concepts and Review, 2) Network Configuration, 3) Network Troubleshooting and Monitoring, 4) Remote Access, 5) Domain Name Service, 6) HTTP Servers, 7) Advanced HTTP Servers, 8) Email Servers, 9) File Sharing, 10) Advanced Networking, 11) HTTP Caching, 12) Network File Systems, 13) Introduction to Network Security, 14) Firewalls, 15) LXC Virtualization Overview, 16) High Availability, 17) Database, 18) System Log, and 19) Package Management.
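As a tiny preview of module 5 (Domain Name Service), here is a sketch of name resolution using Python's standard socket module; localhost is used so it works even without a network connection:

```python
import socket

def resolve(hostname):
    """Return the unique IPv4 addresses a hostname resolves to."""
    infos = socket.getaddrinfo(hostname, None, family=socket.AF_INET)
    # Each entry is (family, type, proto, canonname, sockaddr);
    # sockaddr[0] is the IP address string.
    return sorted({info[4][0] for info in infos})

addresses = resolve("localhost")
print(addresses)
```

The course goes well beyond this, into running and troubleshooting the name servers that answer such queries.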

Enroll today (individuals & teams welcome): https://tinyurl.com/3nh9hkna 


Much career success, Lawrence E. Wilson -  Online Learning Central


Thursday, October 28, 2021

Learn JavaScript Unit Testing

Colleagues, the JavaScript Unit Testing course enables programmers to catch more of their own bugs before deploying their code. Testing is so important that some developers write tests before anything else, in a methodology known as test-driven development. Learn the fundamentals of test-driven development and the popular JavaScript testing library, Mocha. This course focuses on unit tests, which are the smallest and most specific types of software tests. After this course, you'll be ready to move on to testing larger software features. Gain high-demand expertise in: 1) Why Test? - Manual Testing, Automated Testing, The Test Suite, Tests As Documentation, Regression; 2) Write Good Tests With Mocha - learn to use the Mocha framework and the Node.js assert library to write, automate, and organize tests in JavaScript - Install Mocha I, Install Mocha II, describe and it blocks, assert, Setup, Exercise, and Verify, Teardown, Hooks; and 3) Learn Test-Driven Development With Mocha - learners will practice test-driven development to create their own JavaScript testing suite using Mocha.js - Getting Into The Red I, Red To Green I, Refactor I, Getting into the Red II, Red to Green II, Refactor II, and Edge Case.
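The course teaches Mocha, but the Setup-Exercise-Verify-Teardown pattern it drills is language-agnostic; here is the analogous structure sketched in Python's built-in unittest (the Calculator class is invented purely for illustration):

```python
import unittest

class Calculator:
    """Tiny class under test, invented for this sketch."""
    def __init__(self):
        self.total = 0
    def add(self, x):
        self.total += x
        return self.total

class TestCalculator(unittest.TestCase):
    def setUp(self):                      # like a Mocha beforeEach hook
        self.calc = Calculator()          # Setup

    def test_add_accumulates(self):       # like an it() block
        self.calc.add(2)                  # Exercise
        result = self.calc.add(3)
        self.assertEqual(result, 5)       # Verify

    def tearDown(self):                   # like an afterEach hook
        self.calc = None                  # Teardown

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestCalculator)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Mocha's describe/it blocks play the role of the TestCase class and its test methods here.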

Sign-up today (teams & execs welcome): https://fxo.co/Ccqz  


Much career success, Lawrence E. Wilson - Online Learning Central


Wednesday, October 27, 2021

Statistical Modeling for Data Science Applications Specialization

Colleagues, the Statistical Modeling for Data Science Applications Specialization will equip you to build your statistical skills for Data Science and master the statistics necessary for the field. Learn to correctly analyze and apply the tools of regression analysis to model relationships between variables and make predictions given a set of input variables, successfully conduct experiments based on best practices in experimental design, and use advanced statistical modeling techniques, such as generalized linear and additive models, to model a wide range of real-world relationships. Gain high-demand skills in Linear Models, R Programming, Statistical Models, Regression, Calculus and Probability Theory, and Linear Algebra. The three courses in this Specialization include: 1) Modern Regression Analysis in R - provides a set of foundational statistical modeling tools for data science. In particular, students will be introduced to methods, theory, and applications of linear statistical models, covering the topics of parameter estimation, residual diagnostics, goodness of fit, and various strategies for variable selection and model comparison. Attention will also be given to the misuse of statistical models and the ethical implications of such misuse, 2) ANOVA and Experimental Design - this second course in statistical modeling will introduce students to the study of the analysis of variance (ANOVA), which provides the mathematical basis for designing experiments for data science applications. Emphasis will be placed on important design-related concepts, such as randomization, blocking, factorial design, and causality. Some attention will also be given to ethical issues raised in experimentation, and 3) Generalized Linear Models and Nonparametric Regression - study a broad set of more advanced statistical modeling tools. 
Such tools will include generalized linear models (GLMs), which will provide an introduction to classification (through logistic regression); nonparametric modeling, including kernel estimators and smoothing splines; and semi-parametric generalized additive models (GAMs). Emphasis will be placed on a firm conceptual understanding of these tools. Attention will also be given to ethical issues raised by using complicated statistical models.
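The F statistic at the heart of the ANOVA course is straightforward to compute by hand. Here is a minimal one-way ANOVA sketch in plain Python on made-up data (the Specialization itself works in R):

```python
from statistics import mean

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA across the given groups."""
    all_values = [v for g in groups for v in g]
    grand_mean = mean(all_values)
    k = len(groups)                      # number of groups
    n = len(all_values)                  # total observations
    # Between-group and within-group sums of squares.
    ss_between = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum((v - mean(g)) ** 2 for g in groups for v in g)
    df_between, df_within = k - 1, n - k
    return (ss_between / df_between) / (ss_within / df_within)

# Toy data: three treatment groups of three observations each.
f_stat = one_way_anova_f([1, 2, 3], [2, 3, 4], [5, 6, 7])
print(round(f_stat, 3))
```

A large F (here 13.0) suggests the group means differ by more than within-group noise alone would explain; the course covers comparing it against the F distribution.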

Sign-up today (teams & execs welcome): https://tinyurl.com/3unpx3sb 


Much career success, Lawrence E. Wilson - Online Learning Central

Tuesday, October 26, 2021

Cloud Native Application Architecture

Colleagues, this Cloud Native Application Architecture program helps you meet the growing demand for cloud native architects and learn to identify the best application architecture solutions for an organization's needs. Master the skills necessary to become a successful cloud native architect. Learn to run and manage scalable applications in a cloud native environment, using open source tools and projects like ArgoCD, gRPC, and Grafana. The training modules, each with a hands-on project, include: 1) Cloud Native Fundamentals - learn how to structure, package, and release an application to a Kubernetes cluster, while using an automated CI/CD pipeline. Students will start by applying a suite of good development practices within an application, package it with Docker, and distribute it through DockerHub. This will transition to the exploration of Kubernetes resources and how these can be used to deploy an application. Toward the end of the course, students will learn the fundamentals of Continuous Integration and Continuous Delivery (CI/CD) with GitHub Actions and ArgoCD and completely automate the release process for an application (Project: TechTrends); 2) Message Passing - learn how to refactor microservice capabilities from a monolithic architecture, and employ different forms of message passing in microservices. To begin, students will create a migration strategy to refactor a service from a monolith to its own microservice and implement the migration. Next, students will be introduced to industry-standard best practices for message passing in a service architecture (Project: Refactor Udaconnect); 3) Observability - observability in distributed systems. To be effective as an observability expert, it is critical to understand how to monitor and respond to the health and performance of both your Kubernetes clusters and the applications hosted on them. This course will teach students how to collect system performance data using Prometheus (Project: Building a Metrics Dashboard); 4) Microservices Security - harden a Docker and Kubernetes microservices architecture. To begin, students will learn to use STRIDE threat modeling to reason about microservice security. Next, students will dig deep to explore the Docker and Kubernetes attack surface and be introduced to industry open-source tools such as Docker-bench and Kube-bench to evaluate and harden Docker and Kubernetes weaknesses. Students will then learn about software composition analysis with Trivy and Grype to evaluate image layers and common application security vulnerabilities and provide remediation (Project: Hardened Microservices Environment); and 5) Capstone Project - Uda CityShop - evaluate the costs of products in different currencies and follow recommendations for browsing products with varying discount rates based on ads.

Sign-up today (teams & execs welcome): https://tinyurl.com/yn2w48vy 


Much career success, Lawrence E. Wilson - Online Learning Central


Monday, October 25, 2021

Become a Data Architect


Colleagues, in the Become a Data Architect program you will learn how to plan, design, and implement enterprise data infrastructure solutions and create the blueprints for an organization's data management system. You'll create a relational database with PostgreSQL, design an Online Analytical Processing (OLAP) data model to build a cloud-based data warehouse, and design a scalable data lake architecture that meets the needs of Big Data. Training modules and hands-on projects cover: 1) Data Architecture Foundations - learn to design a data model, normalize data, and create a professional ERD. Finally, you will take everything you learned and create a physical database using PostgreSQL (Project: Design an HR Database); 2) Designing Data Systems - learn to design OLAP dimensional data models, design ELT data processing that is capable of moving data from an ODS to a data warehouse, and write SQL queries for the purpose of building reports (Project: Design a Data Warehouse for Reporting & OLAP); 3) Big Data Systems - learn about the internal architecture of many of the Big Data tools such as HDFS, MapReduce, Hive, and Spark, and how these tools work internally to provide distributed storage, distributed processing capabilities, fault tolerance, and scalability. Next you will learn how to evaluate NoSQL databases and their use cases, and dive deep into creating and updating a NoSQL database with Amazon DynamoDB. Finally, you will learn how to implement data lake design patterns and how to enable transactional capabilities in a data lake (Project: Design an Enterprise Data Lake System); and 4) Data Governance - learn about the different types of metadata, and how to build a Metadata Management System, Enterprise Data Model, and Enterprise Data Catalog. Finally, you will learn the concepts of Master Data and the golden record, different types of Master Data Management architectures, as well as the golden record creation and master data governance processes (Project: Data Governance at Sneakerpeak). Intermediate Python and SQL skills and familiarity with ETL/Data Pipelines are needed.
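As a small sketch of the denormalized-to-normalized migration idea in module 1, here is the DDL/DML pattern using Python's built-in sqlite3 as a stand-in for PostgreSQL (schema and data invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

conn.executescript("""
    -- Denormalized source: department name repeated on every row.
    CREATE TABLE employees_denorm (emp_name TEXT, dept_name TEXT);
    INSERT INTO employees_denorm VALUES
        ('Ada', 'Engineering'), ('Grace', 'Engineering'), ('Alan', 'Research');

    -- DDL: the normalized target schema.
    CREATE TABLE departments (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
    CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT,
                            dept_id INTEGER REFERENCES departments(id));

    -- DML: migrate departments first, then the rows that reference them.
    INSERT INTO departments (name)
        SELECT DISTINCT dept_name FROM employees_denorm;
    INSERT INTO employees (name, dept_id)
        SELECT e.emp_name, d.id
        FROM employees_denorm e JOIN departments d ON d.name = e.dept_name;
""")

dept_count = conn.execute("SELECT COUNT(*) FROM departments").fetchone()[0]
emp_count = conn.execute("SELECT COUNT(*) FROM employees").fetchone()[0]
print(dept_count, emp_count)
```

Each department name now lives in exactly one row, and employees reference it by key, which is the consistency payoff of normalization.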

Sign-up today (teams & execs welcome): https://tinyurl.com/2f6fkf6v 


Much career success, Lawrence E. Wilson - Online Learning Central



Thursday, October 21, 2021

Machine Learning Engineer for Microsoft Azure

Colleagues, the Machine Learning Engineer for Microsoft Azure program will strengthen your machine learning skills and build practical experience by training, validating, and evaluating models using Azure Machine Learning. Students will enhance their skills by building and deploying sophisticated machine learning solutions using popular open-source tools and frameworks, and gain practical experience running complex machine learning tasks using the built-in Azure labs accessible inside the Udacity classroom. The three skill-based training modules, each with a hands-on project, include: 1) Using Azure Machine Learning - machine learning is a critical business operation for many organizations. Learn how to configure machine learning pipelines in Azure, identify use cases for Automated Machine Learning, and use the Azure ML SDK to design, create, and manage machine learning pipelines in Azure (Project: Optimizing an ML Pipeline in Azure), 2) Machine Learning Operations - operationalizing machine learning, from selecting the appropriate targets for deploying models, to enabling Application Insights, identifying problems in logs, and harnessing the power of Azure's Pipelines. All these concepts are part of core DevOps pillars that will allow you to demonstrate solid skills for shipping machine learning models into production (Project: Operationalizing Machine Learning), and 3) Capstone Project - use the knowledge you have obtained from this Nanodegree program to solve an interesting problem. You will use Azure's Automated ML and HyperDrive to solve a task, then deploy the model as a web service and test the model endpoint. Prior experience with Python, Machine Learning, and Statistics is recommended.
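Azure's tooling aside, the train/validate/evaluate loop at the heart of the program can be sketched with nothing but the standard library. Here is a toy nearest-centroid classifier (all data and names invented) that fits on a training set and reports validation accuracy:

```python
def centroid(points):
    """Mean point of a list of 2-D points."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def train(data):
    """data: list of (point, label) pairs. Returns one centroid per label."""
    by_label = {}
    for point, label in data:
        by_label.setdefault(label, []).append(point)
    return {label: centroid(pts) for label, pts in by_label.items()}

def predict(model, point):
    """Assign the label whose centroid is nearest (squared distance)."""
    dist2 = lambda a, b: (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    return min(model, key=lambda label: dist2(model[label], point))

# Hand-made train/validation split on toy 2-D data.
train_set = [((0, 0), "a"), ((1, 0), "a"), ((5, 5), "b"), ((6, 5), "b")]
valid_set = [((0, 1), "a"), ((5, 6), "b")]

model = train(train_set)
accuracy = sum(predict(model, p) == y for p, y in valid_set) / len(valid_set)
print(accuracy)
```

The program applies the same loop at scale: fit on training data, score on held-out data, and only then deploy.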

Sign-up today (teams & execs welcome): https://tinyurl.com/2pe8hrvj 


Much career success, Lawrence E. Wilson - Online Learning Central


Wednesday, October 20, 2021

Essentials of Linux System Administration (LFS201)

OSS colleagues, the Essentials of Linux System Administration (LFS201) course offered by the Linux Foundation will equip you to administer the #1 operating system for web servers, cloud computing, smartphones, and consumer electronics. Due to its high adoption rates and continued growth, there's a shortage of Linux system administrators. This course will teach you the skills and processes you need to work as a professional Linux systems administrator. Learn how to administer, configure, and upgrade Linux systems running one of the three major Linux distribution families (Red Hat, SUSE, Debian/Ubuntu). Skill-based training modules include: 1) Linux Filesystem Tree Layout, 2) Processes, 3) Signals, 4) Package Management Systems, 5) RPM, 6) DPKG, 7) yum, 8) zypper, 9) APT, 10) System Monitoring, 11) Process Monitoring, 12) Memory: Monitoring Usage and Tuning, 13) I/O Monitoring and Tuning, 14) I/O Scheduling, 15) Linux Filesystems and the VFS, 16) Disk Partitioning, 17) Filesystem Features: Attributes, Creating, Checking, Mounting, 18) Filesystem Features: Swap, Quotas, Usage, 19) The ext2/ext3/ext4 Filesystems, 20) The XFS and btrfs Filesystems, 21) Encrypting Disks, 22) Logical Volume Management (LVM), 23) RAID, 24) Kernel Services and Configuration, 25) Kernel Modules, 26) Devices and udev, 27) Virtualization Overview, 28) Containers Overview, 29) User Account Management, 30) Group Management, 31) File Permissions and Ownership, 32) Pluggable Authentication Modules (PAM), 33) Network Addresses, 34) Network Devices and Configuration, 35) Firewalls, 36) System Startup and Shutdown, 37) GRUB, 38) System Init: systemd, SystemV and Upstart, 39) Backup and Recovery Methods, 40) Linux Security Modules, 41) Local System Security, 42) Basic Troubleshooting, and 43) System Rescue.

Enroll today (teams & execs welcome): https://tinyurl.com/2kjpxyz3 


Much career success, Lawrence E. Wilson -  Online Learning Central

Monday, October 18, 2021

JavaScript Developer (Training)

Dev colleagues, the JavaScript Developer program will train you to use JavaScript, jQuery, and Vue.js, and includes 180 course hours. You will start the course by mastering JavaScript. You will then be introduced to React and learn the latest React techniques to build an application. Next, you will learn Vue.js, beginning with a simple "Hello, Vue!" app and working through the process of creating a small but featureful math app. Finally, you will learn jQuery to maintain and modernize existing websites that use it. When your coursework is complete, you will put your skills to use to build your own dynamic web application. Learn the basic constructs of JavaScript; how to use JavaScript to access and change page elements and their properties; how to test and debug JavaScript using Google Chrome; additional, advanced JavaScript techniques that prepare you for learning specific frameworks; the Vue and React frameworks for building applications; and how to use jQuery to maintain and modernize existing websites. Gain intensive JavaScript skills to jump-start a career in a growing technical field, master Vue.js and React.js, and be able to maintain legacy work with jQuery methods. This self-paced, online course takes six months to complete and has open enrollment - begin anytime.

Sign-up today (teams & execs welcome): https://tinyurl.com/bvrbjnkv 


Much career success, Lawrence E. Wilson - Online Learning Central


Thursday, October 14, 2021

Hypothesis Testing with Python

Colleagues, the Hypothesis Testing with Python program enables you to plan, implement, and interpret a hypothesis test in Python. Hypothesis testing is used to address questions about a population based on a subset from that population. For example, A/B testing is a framework for learning about consumer behavior based on a small sample of consumers. This course assumes some preexisting knowledge of Python, including the NumPy and pandas libraries. Acquire high-demand, marketable skills in 1) Introduction to Hypothesis Testing - Find out what you’ll learn in this course and why it’s important, 2) Hypothesis testing: Testing a Sample Statistic - Learn about hypothesis testing and implement binomial and one-sample t-tests in Python, 3) Hypothesis Testing: Testing an Association - Learn about hypothesis tests that can be used to evaluate whether there is an association between two variables, 4) Experimental Design - Learn to design an experiment to make a decision using a hypothesis test, and 5) Hypothesis Testing Projects - Practice your hypothesis testing skills with some additional projects.
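The one-sample tests in module 2 can be previewed with a hand-rolled exact binomial test, shown here in plain Python using only the standard library (toy question: are 14 heads in 20 flips consistent with a fair coin?):

```python
from math import comb

def binomial_pmf(k, n, p):
    """Probability of exactly k successes in n trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def binomial_test_two_sided(k, n, p=0.5):
    """Two-sided p-value: sum the probabilities of every outcome
    at most as likely as the observed count k."""
    observed = binomial_pmf(k, n, p)
    return sum(binomial_pmf(i, n, p) for i in range(n + 1)
               if binomial_pmf(i, n, p) <= observed + 1e-12)

p_value = binomial_test_two_sided(14, 20)
print(round(p_value, 4))
```

At the usual 0.05 threshold this p-value does not reject the fair-coin hypothesis. SciPy's binomtest implements the same two-sided rule; rolling it by hand makes the definition of the p-value concrete.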

Enroll today (teams & execs welcome): https://fxo.co/CZIg 


Much career success, Lawrence E. Wilson - Online Learning Central

Wednesday, October 13, 2021

Learn Git (Training)

Dev colleagues, the Learn Git program equips you with the tool developers use to save all relevant versions of their work and avoid losing it. Git also makes it easy for developers to collaborate and share work with others. Git, simply put, is a tool to save versions of your code. This course will teach you a basic workflow and Git's core features, different ways to undo changes or save multiple versions of a project, and how to collaborate with other developers. Gain hands-on skills including: 1) Basics of Workflow - Hello Git, Git Init, Git Workflow, Git Status, Git Add, Git Diff, Git Log, Git Commit, and Git Generalizations; and 2) How to Backtrack - Backtracking Introduction, Head Commit, Git Checkout, Git Reset I & II, Git Reset Review, and Generalizations.
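The basic workflow from module 1 (init, add, commit, status, log) can be sketched by driving the git CLI from Python; this assumes git is installed, and the repo, file, and identity below are invented for the example:

```python
import os
import subprocess
import tempfile

def git(repo, *args):
    """Run a git command in `repo` and return its stdout."""
    cmd = ["git", "-c", "user.name=Demo",
           "-c", "user.email=demo@example.com", *args]
    return subprocess.run(cmd, cwd=repo, capture_output=True,
                          text=True, check=True).stdout

repo = tempfile.mkdtemp()
git(repo, "init")                            # start a new repository
with open(os.path.join(repo, "hello.txt"), "w") as f:
    f.write("hello git\n")
git(repo, "add", "hello.txt")                # stage the change
git(repo, "commit", "-m", "first commit")    # save a version
status = git(repo, "status", "--porcelain")  # empty output: tree is clean
log = git(repo, "log", "--oneline")
print(log.strip())
```

In day-to-day work you type these same commands in a terminal; the course then builds on them with checkout and reset for backtracking.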

Sign-up today (teams & execs welcome): https://fxo.co/CcrC 


Much career success, Lawrence E. Wilson - Online Learning Central


Monday, October 11, 2021

Learn Linear Regression in R

Dev colleagues, the Learn Linear Regression in R course equips you to implement linear regression using the R programming language. Linear regression models are used in machine learning, so this course serves as an introduction to that topic as well. R is used by professionals in the Data Analysis and Data Science fields as part of their daily work. Take-away skills: learn how to build linear regression models using R. Skill-based training modules include: 1) Introduction to Linear Regression in R - linear regression is the workhorse of applied Data Science; it has long been the most commonly used method by scientists and can be applied to a wide variety of datasets and questions, 2) Assumptions of Simple Linear Regression - while linear regression is perhaps the most widely applied method in Data Science, it relies on a strict set of assumptions about the relationship between predictor and outcome variables, 3) Assumptions of Linear Regression (Outliers) - check for outlier data points; linear regression models also assume that there are no extreme values in the data set that are not representative of the actual relationship, 4) Building a Simple Linear Model - simple linear regression is an uncomplicated technique for predicting a continuous outcome variable, Y, on the basis of just one predictor variable, X, 5) Quantifying Model Fit - once we understand the kind of relationship our model describes, we want to understand the extent to which the modeled relationship actually fits the data, 6) Checking Model Residuals, 7) Visualizing Model Fit - it is always a best practice to produce visual summaries to assess your model, 8) Reading Model Results - confirm that your data fulfills the assumptions of simple linear regression models, 9) Assessing Simple Linear Regression - practice your model interpretation skills, including interpreting the regression coefficients of continuous independent variables, 10) Making Predictions - build models to make predictions on new data, including with the add_predictions() function from the modelr package, 11) Multiple Linear Regression - see how the results of simple linear regression models convey a substantial amount of information about the relationship between two variables, and 12) Assessing Multiple Linear Regression.
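The course works in R, but the simple linear regression it builds can be sketched in a few lines of plain Python to show roughly what R's lm() computes under the hood (toy data invented for the example):

```python
from statistics import mean

def simple_ols(x, y):
    """Least-squares fit of y = intercept + slope * x, plus R-squared."""
    x_bar, y_bar = mean(x), mean(y)
    sxx = sum((xi - x_bar) ** 2 for xi in x)
    sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = y_bar - slope * x_bar
    # R-squared: share of the variance in y explained by the model.
    ss_res = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - y_bar) ** 2 for yi in y)
    r_squared = 1 - ss_res / ss_tot
    return slope, intercept, r_squared

# Perfectly linear toy data: y = 2x + 1.
slope, intercept, r2 = simple_ols([1, 2, 3, 4], [3, 5, 7, 9])
print(slope, intercept, r2)
```

With real, noisy data R-squared falls below 1, which is exactly what the Quantifying Model Fit module examines.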

Sign-up today (teams & execs welcome): https://fxo.co/Ccqr 


Much career success, Lawrence E. Wilson - Online Learning Central


Wednesday, October 6, 2021

Full Stack Software Developer (Training)

Dev colleagues, the Full Stack Software Developer program teaches you how to create and maintain full-service websites. You will learn all the significant aspects of front-end, back-end, and full-stack development through several milestone exercises and a hands-on project. Throughout the course, you will build a website that hosts learning games. By course completion, your website will allow users to log in, play games, track their progress, see leaderboards, and manage their accounts. The program includes 650 course hours. Learn to build a fully functioning database-driven website from the ground up, use modern JavaScript libraries to make single-page web applications, and build a secure user authentication system to manage user data. High-demand, skill-based training modules include: 1) Things Every Developer Should Know, 2) Introduction to JavaScript, 3) Advanced JavaScript Concepts, 4) Creating, Styling, and Validating Web Forms, 5) Vue.js, 6) Bootstrap Training, 7) PostgreSQL, 8) Python - Introduction and Advanced, 9) Django for Python Developers, 10) Introduction to WordPress Training, and 11) Capstone Project.

Sign-up today (teams & execs welcome): https://tinyurl.com/2vz47n38 


Much career success, Lawrence E. Wilson - Online Learning Central


Monday, October 4, 2021

FinOps Certified Practitioner (FOCP) - Linux Foundation

OSS colleagues, the FinOps Certified Practitioner - FOCP (aka "Cloud Financial Operations") certification program from the Linux Foundation covers FinOps: the practice of bringing financial accountability to the variable spend model of cloud, enabling distributed teams to make business trade-offs between speed, cost, and quality. The certification covers FinOps fundamentals and an overview of key concepts in each of the three phases of the FinOps lifecycle: Inform, Optimize, and Operate. Skill-based training is mapped to the cert exam objectives: 1) Challenge of Cloud (8%) - understand the challenges of working with cloud and recognize the differences between cloud and traditional IT, 2) What is FinOps & FinOps Principles (12%) - define FinOps and understand each of the FinOps Principles, 3) FinOps Teams & Motivation (12%) - describe the skills, roles, and responsibilities of a FinOps team, describe where a FinOps team would be situated, and understand what drives a FinOps team's size, position, and makeup, 4) FinOps Capabilities (28%) - understand the six pillars of FinOps Capabilities and describe the activities associated with each capability, 5) FinOps Lifecycle (30%) - describe the FinOps lifecycle, its phases, and its purpose, and understand the basic processes that FinOps teams enact in the FinOps Lifecycle, and 6) Terminology & the Cloud Bill (10%) - understand basic cloud, FinOps, DevOps, and Finance terminology and describe the characteristics of cloud bills. Exam details: this is an online, non-proctored exam that can take up to 60 minutes to complete. It includes 50 multiple-choice questions, some with multiple selections as indicated in the question text.

Register today (teams & execs welcome): https://tinyurl.com/dt6x2rpy 


Much career success, Lawrence E. Wilson -  Online Learning Central


Thursday, September 30, 2021

Deploying and Managing Linux on Azure

OSS Cloud colleagues, the Deploying and Managing Linux on Azure program will open up new career and income opportunities for you while equipping you with high-demand, marketable skills. The program offers over 11 hours of video instruction covering everything a Microsoft professional needs to know to deploy, manage, and monitor Linux in an Azure environment. This comprehensive video course starts with the essentials, then quickly moves forward from there, focusing on tasks specific to managing Linux workloads in an Azure environment. Content includes managing containers, monitoring Linux, working with configuration management tools, and working with Azure-specific services for managing Linux. Training modules address: 1) Linux and Azure Fundamentals, 2) Advanced Linux Administration, 3) Using Containers, 4) Deploying Linux in Azure Cloud, 5) Configuration Management Solutions, 6) Puppet, 7) Ansible, 8) Chef, 9) Salt, 10) Linux in Azure Troubleshooting, 11) Nagios, and 12) Zabbix for device monitoring.

Register today (teams & execs welcome): https://tinyurl.com/28xcnrex 


Much career success, Lawrence E. Wilson -  Online Learning Central