This course gives those new to the Elastic Stack an introductory overview of its core services (Elasticsearch, Logstash, Kibana, Beats), features, terminology, and basic administration. It follows a real-world use case: setting up a log aggregation pipeline for web access logs, then analyzing those logs in Kibana with search, visualizations, and dashboards.
If you are looking for "ELK Stack" material, this is the place! Now that the Elastic Stack is more than Elasticsearch, Logstash, and Kibana, the "ELK" naming convention has been retired.
Big Data Essentials is a comprehensive introduction to the world of Big Data. Starting with the definition of Big Data, we describe the various characteristics of Big Data and its sources. Using real-world examples, we highlight the growing importance of Big Data. We discuss architectural requirements and principles of Big Data infrastructures and the intersection of cloud computing with Big Data. We also provide an overview of the most popular Big Data technologies, including core Hadoop, the Hadoop ecosystem (Hive, Pig, Sqoop, Flume, Kafka, Storm, Ambari, Oozie, Zookeeper), NoSQL databases, and Apache Spark. We conclude this lesson with a tour of the different types of analytics that can be performed on Big Data and the various techniques and tools used.
Hadoop has become a staple technology in the big data industry by enabling the storage and analysis of datasets so large that they would otherwise be impossible to handle with traditional data systems. In this course, we are going to jump right into deploying Hadoop, configuring HDFS, and executing MapReduce jobs. Lastly, you will get to try it all out yourself with a guided hands-on learning activity. So let's get started!
Download the Hadoop Quick Start Interactive Guide here: https://interactive.linuxacademy.com/diagrams/HadoopQuickStart.html
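Not part of the course text, but as a taste of the MapReduce model the course covers, here is a minimal word-count mapper and reducer in Python, in the style used with Hadoop Streaming (the sample input is invented for illustration):

```python
"""Minimal word-count MapReduce in the Hadoop Streaming style.

With Hadoop Streaming, a mapper reads raw input lines and emits
(key, value) pairs; Hadoop groups the pairs by key before handing
them to the reducer. The functions below capture that logic so they
can also be chained locally for a quick sanity check.
"""

def map_words(lines):
    """Mapper: emit (word, 1) for every whitespace-separated token."""
    for line in lines:
        for word in line.strip().split():
            yield word.lower(), 1

def reduce_counts(pairs):
    """Reducer: sum the counts emitted for each word."""
    totals = {}
    for word, count in pairs:
        totals[word] = totals.get(word, 0) + count
    return totals

# Quick local run on a couple of sample lines.
sample = ["the quick brown fox", "the lazy dog"]
counts = reduce_counts(map_words(sample))
```

In a real Hadoop Streaming job, the mapper and reducer would be separate scripts reading stdin and writing tab-separated pairs to stdout; chaining the two functions locally, as above, is just a convenient way to check the logic.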
Linux Academy has many innovative tools and services, like your own cloud lab, that cannot be found anywhere else. This course covers the robust, student-success-focused offering we have developed and how to use it. Anybody looking to maximize their learning, and the value they receive from a Linux Academy membership, should go through this course.
Linux Academy provides more content, hands-on labs, your own cloud lab, and more value per dollar than any other training provider.
This course is a replacement for the older "Introduction To Linux Academy" course.
In this course, we will start with the basics of what a database is, then we will explore the different types of databases from flat file to relational. We will install some of the more popular database systems that are available on Linux and see how to work with data in those systems.
Interactive Diagram: https://interactive.linuxacademy.com/diagrams/databaseessentials.html
Follow right on the heels of the Elastic Stack Essentials course with the Elasticsearch Deep Dive. Get to understand and go hands-on with the core functionality of Elasticsearch (installing, indexing, querying). Next, learn how to configure it for production use with TLS encryption, user access control, monitoring, and alerting with X-Pack, plus automated management with Elasticsearch Curator. Finally, learn best practices around heap and cluster sizing, hardware requirements, and performing live upgrades.
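To give a flavor of the indexing and querying the Deep Dive goes hands-on with, here is a sketch of the JSON request bodies involved, built as plain Python dicts (the index mapping and field names are invented for illustration; in practice you would send these to Elasticsearch with the official client or curl):

```python
import json

# Hypothetical settings and mappings for an access-log index.
index_body = {
    "settings": {"number_of_shards": 1, "number_of_replicas": 1},
    "mappings": {
        "properties": {
            "timestamp": {"type": "date"},
            "status": {"type": "integer"},
            "message": {"type": "text"},
        }
    },
}

# A bool query: full-text match on the message field, filtered to
# server-error status codes.
search_body = {
    "query": {
        "bool": {
            "must": [{"match": {"message": "timeout"}}],
            "filter": [{"range": {"status": {"gte": 500}}}],
        }
    }
}

# Serialize a body the way it would travel over the wire.
wire_payload = json.dumps(search_body)
```

Keeping query bodies as plain data structures like this makes them easy to build, inspect, and test before they ever reach a cluster.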
In this course, you will develop the skills that you need to write effective and powerful scripts and tools using Python. We will go through the features of the Python language needed to write such scripts and command line tools (data types, loops, conditionals, functions, error handling, and more). Beyond the language itself, you will go through the full development process, including project setup, planning, and automated testing, to build two different command line tools.
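As a small taste of the kind of command line tool the course builds (this particular tool, a line counter, is invented for illustration), here is a sketch combining argparse, functions, and error handling:

```python
import argparse
import sys

def count_lines(text, pattern=None):
    """Count lines, optionally only those containing a substring."""
    lines = text.splitlines()
    if pattern is not None:
        lines = [line for line in lines if pattern in line]
    return len(lines)

def build_parser():
    """Assemble the CLI: a file argument plus an optional filter flag."""
    parser = argparse.ArgumentParser(description="Count lines in a file.")
    parser.add_argument("path", help="file to read")
    parser.add_argument("--contains",
                        help="only count lines containing this text")
    return parser

def main(argv):
    args = build_parser().parse_args(argv)
    try:
        with open(args.path) as handle:
            text = handle.read()
    except OSError as err:
        # Error handling: report the problem and exit nonzero
        # instead of letting a traceback reach the user.
        print(f"error: {err}", file=sys.stderr)
        return 1
    print(count_lines(text, args.contains))
    return 0
```

Keeping the parsing, the logic, and the entry point in separate functions is what makes a tool like this easy to cover with automated tests, which is exactly the workflow the course walks through.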
As one of the early log aggregation products in the IT industry, Splunk has remained a popular choice amongst system administrators, engineers, and developers for operational analytics. Whether you are aggregating log files, system resource utilization metrics, or application data, Splunk is there to centralize your IT data for easy search and visualization.
This course serves as an introduction to Splunk Enterprise. After getting familiar with some basic terminology and components, you will get to follow along by setting up your own standalone Splunk instance through the Linux Academy Cloud Playground. With your own instance, you can follow along as we secure our standalone Splunk instance, configure monitoring and alerting, and finally index some log data to perform search and visualization analysis.
This course is designed to help you learn and develop the skills required to pass the Microsoft Azure DP-200 certification exam. We will deep dive into the technical tasks the exam measures, providing the background you need to pass and to better understand the Azure world of big data and analytics.
Interactive Diagram: https://interactive.linuxacademy.com/diagrams/DP200.html
Special Note: This course is being brought to you via Early Access. Check out the Introduction section for a video on how to be sure you get the most out of this preview of the course!
Big data technologies are some of the most exciting and in-demand skills. These tools power large companies such as Google and Facebook, so it is no wonder AWS is spending more time and resources developing certifications and new services to catalyze the move to AWS big data solutions.
This course will provide you with much of the knowledge needed to be prepared to take the AWS Big Data Specialty certification exam. We will cover the different AWS (and non-AWS!) products and services that appear on the exam. Importantly, we will not cover material you should already understand well, such as AWS Identity and Access Management and global infrastructure. For those foundational concepts, definitely review the AWS Certified Developer - Associate Level course here on Linux Academy.
Access The Data Dispatch: https://interactive.linuxacademy.com/diagrams/thedatadispatch.html
Join the Linux Academy community Slack for chat here: https://linuxacademy-community-slack.herokuapp.com/ and join the #containers channel.
The Google Cloud Professional Data Engineer is able to harness the power of Google's big data capabilities and make data-driven decisions by collecting, transforming, and visualizing data. Through designing, building, maintaining, and troubleshooting data processing systems with a particular emphasis on the security, reliability, fault-tolerance, scalability, fidelity, and efficiency of such systems, a Google Cloud data engineer is able to put these systems to work.
This course will prepare you for the Google Cloud Professional Data Engineer exam by diving into all of Google Cloud's data services. With interactive demonstrations and an emphasis on hands-on work, you will learn how to master each of Google's big data and machine learning services and become a certified data engineer on Google Cloud.
Download the Data Dossier: https://interactive.linuxacademy.com/diagrams/TheDataDossier.html
Elasticsearch has become a favorite technology of administrators, engineers, and developers alike. Whether you are using it with the rest of the Elastic Stack, or on its own, Elasticsearch is a powerful and user-friendly search and analytics engine. Log aggregation, operational analytics, application performance monitoring, NoSQL databases, site search, and ad-hoc data analysis are just a few of the many things Elasticsearch is used for. The Elastic Certified Engineer Certification was created to recognize IT professionals with expertise in Elasticsearch. An Elastic Certified Engineer can design and deploy a complete Elasticsearch solution.
DISCLAIMER: This is not an official course created or approved by Elastic. Linux Academy is in no way affiliated with Elastic or Elasticsearch B.V.
Welcome to Linux Academy's all new AWS Certified Machine Learning - Specialty prep course. This course prepares you to take the AWS Certified Machine Learning - Specialty (MLS-C01) certification exam. It also gives you the hands-on experience required to use machine learning and deep learning in a real-world environment.
This course starts by coming to grips with Machine Learning (ML), Deep Learning (DL), and Artificial Intelligence (AI) terminology. After the theory comes the practice: you'll get hands-on with a number of ML frameworks and AWS services specific to the certification.
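As a flavor of the ML fundamentals covered in the theory portion, here is a toy example (not from the course) of fitting a one-parameter linear model y ≈ w·x with gradient descent, in plain Python:

```python
def fit_slope(xs, ys, lr=0.01, epochs=500):
    """Fit y = w * x by gradient descent on mean squared error."""
    w = 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradient of MSE with respect to w:
        # d/dw (1/n) * sum((w*x - y)^2) = (2/n) * sum(x * (w*x - y))
        grad = (2.0 / n) * sum(x * (w * x - y) for x, y in zip(xs, ys))
        w -= lr * grad
    return w

# Noisy data sampled roughly from y = 2x.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]
w = fit_slope(xs, ys)
```

The same loop of "compute a loss gradient, take a small step against it" is what the DL frameworks in the hands-on portion automate at scale, so it is worth seeing once in a form this small.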
Please note, this is a preview of this course, so not all the content is available yet. New updates will be made from time to time. In the meantime, please join the conversation in the Linux Academy Community Slack where we have a special channel to focus on the development of this course.
Join the Linux Academy community Slack here: https://slack.linuxacademy.com/ and specifically the #aws-mls-c01-2019 channel.
Make sure you’re staying up-to-date with the course as it progresses. Watch the intro video for instructions.