Edge Computing and Predictive Analytics Architect
Course title: Distributed Systems & Cloud Computing with Java
Target group: Mid Level Employee
Level: Extended Know-How
Distributed Systems & Cloud Computing with Java
Provider
Udemy
Description
In this course you will:
- Master the theory of Distributed Systems, Distributed Computing and modern Software Architecture
- Gain the practical skills necessary to build Distributed Applications and Parallel Algorithms, focusing on Java-based technologies
- Deploy groups of distributed Java applications on the Cloud
- Scale Distributed Databases to store petabytes of data
- Build Highly Scalable and Fault Tolerant Distributed Systems
Along the way, you will learn modern technologies like:
- Apache Kafka
- Apache Zookeeper
- MongoDB
- HAProxy
- JSON
- Java HTTP Server and Client
- Protocol Buffers
- Google Cloud Platform
- And many others
By the end of the course you will:
- Apply best practices for building and architecting real-life Distributed Systems
- Scale your Distributed System to handle billions of transactions per day
- Deploy your distributed application on the Cloud
- Choose the right technologies for your use case and Software Architecture
- Use modern Java-based techniques to store and handle large amounts of data
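Among the Java-based technologies listed above is the JDK's built-in HTTP server and client. As a rough illustration of that building block (not course material; class and endpoint names are made up here), a minimal sketch of one node exposing a status endpoint and another node querying it:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class NodeDemo {
    public static void main(String[] args) throws Exception {
        // Start a tiny JDK-built-in HTTP server on an ephemeral port (port 0)
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/status", exchange -> {
            byte[] body = "ok".getBytes();
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        int port = server.getAddress().getPort();

        // Query it with the JDK HttpClient, as a peer node in a cluster might
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:" + port + "/status"))
                .build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());

        server.stop(0);
    }
}
```

The same request/response pattern underlies node-to-node communication in clusters like the distributed document search built later in the course.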
Target
- Students who want to build modern, Distributed Systems at scale
- Students who want to acquire new practical skills in Distributed Computing & Cloud Technologies
- Students proficient in Java who want to take their Software Engineering skills to a new level
- Software developers and engineers
- IT professionals involved in distributed systems and cloud computing
- Computer science students specializing in distributed computing
- Architects designing scalable software architectures
Sector
- Information Technology and Software Development
- Cloud computing services
- Data management and storage
- Enterprise software solutions
Area
- Distributed Systems and Computing
- Cloud Platforms (particularly Google Cloud Platform)
- Big Data and Databases (MongoDB, distributed databases)
- Modern Software Architecture and Design
Learning outcomes
- Design and build massively Parallel Java Applications and Distributed Algorithms at Scale
- Create efficient Cloud-based Software Systems for Low Latency, Fault Tolerance, High Availability and Performance
- Master Software Architecture designed for the modern era of Cloud Computing
- Globally deploy Distributed Programs on the Cloud serving millions of users, billions of requests, & petabytes of data
Learning content
- Introduction to Distributed Systems
- Introduction and Motivation
- Cluster Coordination Service and Distributed Algorithms
- Cluster Management, Registration and Discovery
- Network Communication
- Building Distributed Document Search
- Load Balancing
- Distributed Message Brokers
- Distributed Storage & Databases
- Scaling a Real Database – Distributed MongoDB
- Cloud Computing and Deployment at Global Scale
- Bonus Material
Approach/method
Online
Duration
7.5 hours on-demand video
Assessment
No
Certification
Yes
Date
Always available
Location
Online
Website
Course title: Cloud-Native Development with OpenShift and Kubernetes Specialization
Target group: Mid Level Employee
Level: Foundations
Cloud-Native Development with OpenShift and Kubernetes Specialization
Provider
Coursera
Description
This specialization is intended for application developers, system administrators, and architects seeking to develop their understanding of container technology. In this three-course specialization, you will explore a range of foundational concepts, from use cases for containerized technology, to the differences between Kubernetes and Red Hat OpenShift, to scaling deployed applications.
Applied Learning Project
These courses include detailed hands-on exercises (step-by-step guides) that teach a range of concepts, from the foundational concepts of containerized applications to deploying containerized applications to Kubernetes and OpenShift. After completing this specialization, you should have a foundational understanding of how to develop, deploy, scale, and troubleshoot containerized applications on Kubernetes and OpenShift.
Target
- Application developers, system administrators, architects (entry to mid-level)
- Application developers, system administrators, architects (intermediate)
- Application developers, system administrators, architects (advanced to practical engineers)
Sector
- Information Technology
- Cloud Computing
- Software Development
- IT Operations
- DevOps
- Cloud-Native Platforms
Area
- Container fundamentals
- Use cases for containerized technology
- Kubernetes concepts
- Red Hat OpenShift differences
- Deploying containerized applications
- Scaling deployed containers
- Troubleshooting containerized applications
- DevOps and cloud-native operations
- Observability and monitoring of containerized workloads
- Application lifecycle management in Kubernetes/OpenShift
Learning outcomes
- Learn in-demand skills from university and industry experts
- Master a subject or tool with hands-on projects
- Develop a deep understanding of key concepts
- Earn a career certificate from Red Hat
- Understand the foundational concepts of containerized applications. Learn basic skills of how to develop, deploy, scale and troubleshoot containerized applications to Kubernetes and OpenShift.
Learning content
- Course 1: Foundations of Red Hat Cloud-native Development
- Course 2: Managing Cloud-native Applications with Kubernetes
- Course 3: Advanced Application Management with Red Hat OpenShift
Approach/method
Online
Duration
4 weeks to complete at 10 hours a week
Assessment
No
Certification
Yes
Date
Always available
Location
Online
Website
Course title: Deploy Azure IoT Edge devices and modules
Target group: Mid Level Employee
Level: Extended Know-How
Deploy Azure IoT Edge devices and modules
Provider
Classcentral
Description
Module 1: This module introduces you to Azure IoT Edge and the benefits of bringing cloud-compute capabilities to the device, the IoT Edge runtime modules and module twin properties, and IoT Edge security manager implementation.
After you complete this module, you will be able to:
- Describe the features and capabilities of Azure IoT Edge.
- Describe the IoT Edge runtime and modules.
- Describe IoT Edge security and certificates.
Module 2: This module introduces you to IoT Edge device deployment concepts, the IoT Edge deployment manifest, and other considerations when preparing for IoT Edge device deployments.
After you complete this module, you will be able to:
- Describe Azure IoT Edge deployment concepts.
- Describe the IoT Edge deployment manifest.
- Describe pre-deployment considerations.
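The deployment manifest described in Module 2 is a JSON document whose desired properties tell the IoT Edge runtime which modules to run and how messages are routed. A minimal sketch of its shape follows; the schema version, image tags, and custom module name are illustrative assumptions, not taken from the course:

```json
{
  "modulesContent": {
    "$edgeAgent": {
      "properties.desired": {
        "schemaVersion": "1.1",
        "runtime": {
          "type": "docker",
          "settings": {}
        },
        "systemModules": {
          "edgeAgent": {
            "type": "docker",
            "settings": { "image": "mcr.microsoft.com/azureiotedge-agent:1.4" }
          },
          "edgeHub": {
            "type": "docker",
            "status": "running",
            "restartPolicy": "always",
            "settings": { "image": "mcr.microsoft.com/azureiotedge-hub:1.4" }
          }
        },
        "modules": {
          "SimulatedTemperatureSensor": {
            "type": "docker",
            "status": "running",
            "restartPolicy": "always",
            "settings": { "image": "mcr.microsoft.com/azureiotedge-simulated-temperature-sensor:1.4" }
          }
        }
      }
    },
    "$edgeHub": {
      "properties.desired": {
        "schemaVersion": "1.1",
        "routes": {
          "sensorToHub": "FROM /messages/modules/SimulatedTemperatureSensor/* INTO $upstream"
        },
        "storeAndForwardConfiguration": { "timeToLiveSecs": 7200 }
      }
    }
  }
}
```

The `$edgeAgent` section declares the runtime and the modules to deploy, while the `$edgeHub` section declares message routes, here forwarding the simulated sensor's telemetry upstream to IoT Hub.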
Module 3: This module introduces you to the IoT Edge device gateway patterns, configuring support for downstream devices and device authentication, and details for configuring a transparent gateway device that has child devices.
After you complete this module, you will be able to:
- Describe the IoT Edge device gateway patterns.
- Describe how to authenticate the devices that are connected to a gateway device.
- Describe the configuration of a transparent gateway device.
Module 4: This module provides you with experience deploying, configuring, and running IoT Edge devices and IoT Edge modules in a test environment.
After you complete this module, you will be able to:
- Deploy an Azure IoT Edge enabled Linux VM.
- Create an IoT Edge device identity in IoT Hub using Azure CLI.
- Connect the IoT Edge device to IoT Hub.
- Deploy an IoT Edge module that acts as a temperature sensor.
- Deploy an Azure Stream Analytics module that analyzes temperature data on the IoT Edge device.
Module 5: This module provides you with experience deploying IoT Edge devices, configuring IoT Edge devices as gateway and downstream child devices, and communicating child device messages to IoT Hub using the gateway device.
After you complete this module, you will be able to:
- Deploy an Azure IoT Edge enabled Linux VM as an IoT Edge device.
- Configure the IoT Edge device as a transparent gateway and connect it to IoT Hub.
- Configure the IoT Edge gateway device for communication with downstream IoT devices.
- Create a downstream IoT device and configure its connection to the gateway device.
Target
- IoT engineers, cloud engineers, and IT professionals responsible for edge computing implementations
- IoT solution architects, deployment engineers, and IT administrators planning edge deployments
- Systems integrators, network engineers, and security/compliance specialists configuring gateways
- Field engineers, DevOps/DevSecOps teams, and test engineers validating edge deployments
- System integrators, gateway administrators, and solution architects implementing gateway with downstream devices
Sector
- Manufacturing
- Logistics
- Energy
- Healthcare
- Smart facilities
- Industrial sectors deploying large-scale IoT Edge ecosystems
- Any industry adopting IoT Edge for local data processing
Area
- Azure IoT Edge concepts, runtime modules, and edge security
- IoT Edge deployment concepts, deployment manifest, and pre-deployment considerations
- IoT Edge device gateway patterns, downstream device authentication, and transparent gateway configuration
- Deploying and running IoT Edge on Linux VMs, IoT Hub identity, and edge modules (temperature sensor) and analytics
- Deploying IoT Edge as gateway, configuring transparent gateway, downstream device integration with IoT Hub
Learning outcomes
- Explain the features, capabilities, and benefits of Azure IoT Edge for bringing cloud-compute to devices.
- Describe the IoT Edge runtime, modules, module twin properties, and security mechanisms, including certificates and module identity.
- Understand IoT Edge deployment concepts, create and interpret deployment manifests, and plan for device deployment.
- Identify and configure IoT Edge gateway patterns, including authentication and transparent gateway setup for downstream devices.
- Deploy and manage IoT Edge devices and modules in a test environment, including creating device identities and connecting devices to IoT Hub.
- Configure IoT Edge gateway devices and downstream child devices, enabling secure communication and message routing to IoT Hub.
- Implement IoT Edge solutions for real-time data collection, processing, and analytics on edge devices using modules like Azure Stream Analytics.
Learning content
- Module 1: Examine the Azure IoT Edge environment
- Module 2: Examine IoT Edge device deployment
- Module 3: Examine IoT Edge gateway device configuration
- Module 4: Explore IoT Edge module deployment
- Module 5: Explore IoT Edge gateway configuration
Approach/method
Online
Duration
3-4 hours
Assessment
No
Certification
Yes
Date
Always available
Location
Online
Website
Course title: Advanced Certificate in Edge Computing for Problem Solving
Target group: Senior Employee
Level: Extended Know-How
Advanced Certificate in Edge Computing for Problem Solving
Provider
247campus.org
Description
This course will teach you how to design, deploy, and manage edge computing systems. You’ll learn about edge computing architecture, data processing, and security, as well as how to apply these concepts to real-world problems. Whether you’re a developer, IT professional, or data scientist, this course will help you stay ahead of the curve. By the end of this course, you’ll be able to design and deploy edge computing systems that solve complex problems and drive business success. So why wait? Explore the world of edge computing today and discover a new way to process data.
Target
- Developers: professionals building and deploying edge-enabled applications.
- IT Professionals: practitioners responsible for infrastructure, deployment, and security at the edge.
- Data Scientists: analysts who process and derive insights from edge-collected data.
Sector
- Technology & Telecommunications: edge infrastructure, 5G/6G, and IoT ecosystems.
- Manufacturing & Industrial: predictive maintenance, real-time monitoring, and autonomous systems.
- Retail & Smart Cities: localized data processing for latency-sensitive applications.
- Healthcare & Public Safety: on-site data processing for privacy and immediacy.
Area
- Edge Architecture & Design: multi-tier edge topologies, deployment patterns.
- Data Processing & Analytics: real-time and near-real-time data analysis at the edge.
- Security & Compliance: edge-specific threat models, encryption, and access control.
- Deployment & Management: orchestration, lifecycle, and monitoring of edge nodes.
- Real-World Applications: use cases demonstrating end-to-end edge solutions.
Learning outcomes
- Edge Computing Fundamentals: This unit covers the basics of edge computing, including its definition, benefits, and applications. It also introduces key concepts such as fog computing, mist computing, and edge AI.
- Network Fundamentals for Edge Computing: This unit delves into the networking aspects of edge computing, including network architecture, protocols, and security measures. It also covers the role of 5G networks in edge computing.
- Edge Computing Platforms and Frameworks: This unit explores the various edge computing platforms and frameworks available, including EdgeX, AWS IoT Greengrass, and Google Cloud Edge. It also discusses their features, use cases, and deployment models.
- Edge AI and Machine Learning: This unit focuses on the application of artificial intelligence and machine learning at the edge, including computer vision, natural language processing, and predictive analytics. It also covers edge AI frameworks and tools.
- Edge Security and Privacy: This unit addresses the security and privacy concerns in edge computing, including data encryption, access control, and secure communication protocols. It also discusses the role of edge security in protecting sensitive data.
- Edge Computing Use Cases: This unit explores various use cases for edge computing, including IoT, smart cities, industrial automation, and healthcare. It also discusses the benefits and challenges of each use case.
- Edge Computing Architecture and Design: This unit covers the design principles and architecture patterns for edge computing systems, including edge gateways, edge servers, and edge storage. It also discusses the role of orchestration and management tools.
- Edge Computing and 5G Networks: This unit examines the relationship between edge computing and 5G networks, including the benefits of 5G for edge computing and the opportunities for 5G to enable new edge computing use cases.
- Edge Computing and IoT: This unit focuses on the application of edge computing in IoT scenarios, including device management, data processing, and analytics. It also discusses the role of edge computing in enabling IoT security and privacy.
- Edge Computing and Cloud Interoperability: This unit addresses the challenges and opportunities of integrating edge computing with cloud computing, including data synchronization, security, and scalability. It also discusses the role of edge computing in enabling hybrid cloud architectures.
Learning content
- Introduction to Edge Computing
- Edge Computing Architecture
- Data Processing at the Edge
- Edge Security and Privacy
- Deployment and Management
- Hands-on Projects and Case Studies
Approach/method
Online
Duration
3-4 hours per week
Assessment
No
Certification
Yes
Date
Always available
Location
Online
Website
Course title: Architecting AI Solutions – Scalable GenAI Systems
Target group: Senior Employee
Level: Extended Know-How
Architecting AI Solutions – Scalable GenAI Systems
Provider
Coursera
Description
This course now features Coursera Coach, a smarter way to learn with interactive, real-time conversations that help you test your knowledge, challenge assumptions, and deepen your understanding as you progress through the course. This course offers a comprehensive journey into architecting scalable and efficient Generative AI (GenAI) applications. It equips you with the skills to design, deploy, and optimize GenAI systems. The course starts by laying the foundational knowledge of GenAI, including its evolution from traditional AI to modern architectures, and dives deep into core concepts such as Variational Autoencoders (VAEs) and Generative Adversarial Networks (GANs). By exploring these models, you'll understand their vital role in enabling cutting-edge large language models (LLMs). As you progress, you'll delve into the LGPL architecture, breaking down its components (Gates, Pipes, and Loops) through hands-on simulations. This segment helps you grasp how these elements work in synergy to build robust GenAI applications. You'll also be introduced to best practices for building scalable systems, including containerization, load balancing, fault tolerance, and cloud-native deployment strategies. Practical lessons in infrastructure selection and deployment strategies provide a clear path toward real-world application. The course continues with a focus on building resilient GenAI applications, covering essential topics like error handling, logging, monitoring, and high availability. You'll explore advanced security concerns, disaster recovery strategies, and cost optimization techniques for building GenAI systems that are both cost-effective and highly available. With case studies and hands-on examples, you'll learn to apply these concepts in real-world scenarios like real-time trading systems and diagnostic recommendation systems.
This course is ideal for professionals in AI, cloud computing, and software development who want to master the intricacies of building scalable and resilient GenAI systems. The course requires a fundamental understanding of AI concepts and programming, making it suitable for intermediate-level learners aiming to advance their skills in architecting AI-driven applications.
Target
- Intermediate-to-advanced professionals in AI, cloud computing, and software development
Sector
- Technology/Software Development and IT services
Area
- GenAI fundamentals and evolution from traditional AI
- Core models: Variational Autoencoders (VAEs) and Generative Adversarial Networks (GANs) and their role in large language models (LLMs)
- LGPL architecture (Gates, Pipes, Loops) with hands-on simulations
- Scalable system design: containerization, load balancing, fault tolerance, cloud-native deployment
- Infrastructure selection and deployment strategies for real-world apps
- Resilience: error handling, logging, monitoring, high availability
- Security, disaster recovery, and cost optimization for GenAI systems
- Real-world case studies: real-time trading systems, diagnostic recommendation systems
Learning outcomes
- Master the foundational principles of scalable Generative AI systems and architectures.
- Gain hands-on experience with the LGPL architecture, including Gates, Pipes, and Loops.
- Learn to build resilient and fault-tolerant GenAI applications with modern infrastructure tools.
- Understand advanced topics like Explainable AI (XAI), MLOps, and emerging trends in GenAI.
Learning content
- Module 1: Introduction
- Module 2: GenAI (Generative AI) Deep Dive
- Module 3: The LGPL Architecture – Deep Dive
- Module 4: Building Scalable GenAI Applications
- Module 5: Building for Cloud-Native Deployments
- Module 6: Building Resilient GenAI Applications
- Module 7: Disaster Recovery and High Availability Strategies
- Module 8: Security Threats in GenAI Applications
- Module 9: Cost Optimization Strategies for GenAI Infrastructure
- Module 10: Advanced Topics in GenAI Application Architecture
Approach/method
Online
Duration
8 hours
Assessment
Yes
Certification
Yes
Date
Always available
Location
Online
Website