CONFIG LEARNING HUB
India
Joined 29 May 2012
This channel has demo videos on software such as:
1. apache activemq
2. apache activemq artemis
3. apache spark
4. apache hadoop
5. infinispan
6. apache kafka
7. apache tomcat
8. apache solr
9. redis
10. apache zookeeper
11. apache mesos
12. mysql
13. virtualbox
14. vagrant
15. minikube
16. apache camel
17. chatgpt for devops.
Topics range from basic configuration to Docker and Kubernetes. There are also videos on deploying the above-mentioned software on Docker Swarm and Kubernetes clusters.
Some videos describe Helm charts for deploying the applications.
Like, Share And Subscribe
www.youtube.com/@configlearninghub
Mastering SASL PLAIN & SSL Setup for Apache Kafka!
SASL (Simple Authentication and Security Layer) is a framework for adding authentication support to connection-based protocols. It allows a protocol to define a mechanism for authenticating clients and optionally encrypting or securing communication between the client and the server.
In the context of Apache Kafka, SASL is used for client-server authentication, ensuring that only authorized clients can produce or consume messages. Kafka supports multiple SASL mechanisms, including PLAIN, SCRAM, GSSAPI (Kerberos), and others.
The SASL/PLAIN mechanism is one of the simplest authentication mechanisms in SASL, where the client sends a username and password in plain text to authenticate itself to the server. While this is simple, it is recommended to use it with encryption (like SSL/TLS) to ensure that the username and password are not transmitted in clear text over the network.
SASL with SSL and PLAIN Authentication
SASL With SSL Configuration With Kafka Producer and Consumer
--------------------------------------------------------------------------------------------------------
github.com/arunsrajan/tutorial/tree/main/KafkaSASL_SSL
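As a rough sketch of the client side of this setup (file paths, passwords, the username, and the broker address below are placeholders, not values from the repository), a SASL/PLAIN over SSL client configuration typically looks like this:

```shell
# Hypothetical client.properties for SASL_SSL with the PLAIN mechanism.
# Adjust paths, passwords, and credentials to your own setup.
cat > client.properties <<'EOF'
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="client" \
  password="client-secret";
ssl.truststore.location=/path/to/kafka.truststore.jks
ssl.truststore.password=TruststorePassword
EOF

# Use it with the console producer (consumer is analogous with --consumer.config):
kafka-console-producer.sh --bootstrap-server broker:9093 \
  --topic test --producer.config client.properties
```

Because PLAIN sends credentials as-is, the SSL layer here is what keeps the username and password off the wire in clear text.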
Views: 11
Videos
Secure Your Apache Kafka: Easy SSL Configuration Guide
37 views · 35 days ago
Steps for configuring Apache Kafka SSL: 1. Create a keystore and truststore for configuring SSL between the server and clients respectively. Replace KeystorePassword, KeystoreKeyPassword and TruststorePassword with a more secure combination. Also modify the distinguished name (-dname) to your location, organization, state and country. keytool -genkeypair -keyalg RSA -alias kafka -keystore kafka.keysto...
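The keytool command above is truncated in this listing; as a minimal sketch of the keystore/truststore steps it describes (alias, passwords, validity, and the -dname values are placeholders to replace with your own):

```shell
# 1. Generate the broker keystore with an RSA key pair.
keytool -genkeypair -keyalg RSA -alias kafka \
  -keystore kafka.keystore.jks \
  -storepass KeystorePassword -keypass KeystoreKeyPassword \
  -validity 365 \
  -dname "CN=localhost, OU=Dev, O=Example, L=City, ST=State, C=IN"

# 2. Export the broker certificate.
keytool -exportcert -alias kafka -keystore kafka.keystore.jks \
  -storepass KeystorePassword -file kafka.crt

# 3. Import the certificate into a truststore for clients.
keytool -importcert -alias kafka -file kafka.crt \
  -keystore kafka.truststore.jks \
  -storepass TruststorePassword -noprompt
```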
Kubectl: The Key to Kubernetes Mastery #kubernetes #kubectl #shorts
29 views · 23 days ago
kubectl is a command-line tool for interacting with the Kubernetes cluster.
Kubernetes Manifest Objects !!! #kubernetes #manifest #shorts
24 views · 26 days ago
Kubernetes configuration file (kubeconfig) - explained #kubernetes #configuration #shorts
135 views · a month ago
Submitting Deployment YAML to Kubernetes #kubernetesarchitecture #deployment #cluster #shorts
62 views · a month ago
Kubernetes Architecture - Core Components #kubernetesarchitecture #components #shorts
209 views · a month ago
Kubernetes: The Ultimate HA Setup (Multi-Node, Multi-Master)
38 views · a month ago
Kubernetes: The Ultimate HA Setup (Multi-Node, Multi-Master) #kubernetes #minikube #kind #shorts #shortsfeed
Anycast vs Multicast: Choosing the Right Queue for Your Messaging System
43 views · a month ago
Anycast vs Multicast: Choosing the Right Queue for Your Messaging System #activemq #artemis #queues #routingtype #shorts #shortsfeed #shortsvideo #youtubeshorts #shortstrending #shortsviral
Setting up a Kubernetes cluster using Minikube
137 views · a month ago
Learn how to set up a Kubernetes cluster using Minikube! This tutorial covers: installing Minikube on Windows; starting, stopping & deleting your cluster with profiles; enabling add-ons (dashboard, metrics-server). To download Minikube: github.com/kubernetes/minikube/releases/download/v1.34.0/minikube-windows-amd64.tar.gz Perfect for beginners & devs looking to test Kubernetes locally. Follow along and get started...
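The steps the tutorial covers can be sketched with the following commands (the profile name "demo" is an example):

```shell
# Start a cluster under a named profile.
minikube start -p demo

# Enable the add-ons mentioned in the video.
minikube -p demo addons enable dashboard
minikube -p demo addons enable metrics-server

# Stop the cluster, then delete it entirely when done.
minikube stop -p demo
minikube delete -p demo
```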
Addresses, Queues, Routing Type - ActiveMQ Artemis #activemq #shorts
34 views · a month ago
Addresses, Queues, Routing Type - ActiveMQ Artemis #activemq #shorts
ActiveMQ Artemis Queue Types #activemq #artemis #queues #shorts
61 views · a month ago
ActiveMQ Artemis Broker Instance Create Commands in Tamil #activemq #artemis #shorts #shortstamil
26 views · a month ago
ActiveMQ Artemis Broker Instance Create Commands in Tamil #activemq #artemis #shorts #shortstamil
Apache ActiveMQ Artemis Broker Create #activemq #artemis #broker
36 views · a month ago
Apache ActiveMQ Artemis Broker Create #activemq #artemis #broker
Commands:
Replication, primary: artemis.cmd create --clustered --replicated
Replication, backup: artemis.cmd create --clustered --replicated --backup
Shared store, primary: artemis.cmd create --clustered --shared-store
Shared store, backup: artemis.cmd create --clustered --shared-store --backup
ActiveMQ Artemis High Availability in Tamil - Explained #activemq #artemis #tamilshorts
40 views · a month ago
ActiveMQ Artemis High Availability in Tamil - Explained #activemq #artemis #tamilshorts
Hello can you please setup and show how to use tls or mtls or sasl_tls protocol for authentication
Yes
This was superb! I was struggling to configure bridge, and this video was what I needed. thanks man.
Super
Super
Hi :) Do you have any ideas why I will hit Could not connect to Redis at redis1:6379: Connection refused when creating the cluster? Thanks
Please check the commands in the video description for starting the Redis container and creating the cluster.
Super
Nice
Can you please also add the kafka.yaml file with env so that it will be easy to understand , thanks the video is useful
Check here github.com/arunsrajan/tutorial/tree/main/kafkaconfig-kraft
Nice
Super
sir, is there any artemis cli command to get the queue message time stamp
Check in the web console, or use the "artemis browse" or "artemis consumer" command options. For more information, check ChatGPT.
@configlearninghub The ChatGPT-suggested command only shows the messages and characters, not the timestamp.
Super
Nice
Very nice
Super!
Awesome
Awesome
Super
Thanks
Nice
Super
2 backups is active at a time is it possible or not
Master and slave means 1 live and 1 backup, but you can configure multiple backups as well. If you want more backups and primaries, you can go for a colocated cluster.
@@configlearninghub colocated is possible
A colocated cluster is for multiple primaries and multiple backups.
@configlearninghub But actually I am not using a colocated cluster. I set up 1 master and 2 backups; only 1 backup is live, the other backup is not live...
Only one primary is active at a time. If the primary fails, one of the backup servers becomes the primary and the other remains a backup.
I want to set up 1 live and 2 backups.
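Following the broker create commands shown earlier on the channel, one primary with two replicated backups could be created roughly like this (instance names are placeholders); as the reply above notes, only one of the backups takes over if the primary fails:

```shell
# One primary broker instance with replication enabled.
artemis create --clustered --replicated primary-broker

# Two backup instances; each waits to replicate from the primary.
artemis create --clustered --replicated --backup backup1
artemis create --clustered --replicated --backup backup2
```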
Nice 11
Nice
Super
Super
Lai bhari
Thanks for sharing. After installation, Does operator provide with web dashboard url to view queue and topic status?
Please use the kubectl port-forward command to access the dashboard for each pod, or try this: artemiscloud.io/docs/tutorials/send_receive_port_forwarding/
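For example, forwarding the Artemis web console port from a pod (the pod name is an assumption; the console conventionally listens on 8161):

```shell
# Forward local port 8161 to the broker pod's console port.
kubectl port-forward pod/artemis-broker-0 8161:8161
# Then open http://localhost:8161 in a browser.
```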
Please reach me arunsrajan@gmail.com
Nice explanation
Thanks and welcome
This is more like Spark's standalone mode on the K8s cluster, rather than Kubernetes as the cluster manager. Am I correct?
@@AnkurRanjan-rm5jd Yes, it is a standalone cluster in Kubernetes.
@@configlearninghub In production it will be better if we use K8s as cluster manager. Am I right?
@@AnkurRanjan-rm5jd The Apache Spark community supports only the k8s cluster manager in production.
@@configlearninghub Do you have any articles or videos for this? I am getting confused about this. I am trying to run my PySpark job using k8s cluster
You can try spark-submit to submit a Python job on the k8s cluster using a Spark Docker image with Python support. For submitting a job on a Kubernetes cluster, please watch this video: ua-cam.com/video/6ZYvH7dINig/v-deo.htmlsi=hc_ouqlyXIPGyvKb
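A sketch of such a spark-submit invocation against Kubernetes (the API server address, image name, and script path are placeholders, not values from the video):

```shell
spark-submit \
  --master k8s://https://kubernetes-api:6443 \
  --deploy-mode cluster \
  --name pyspark-job \
  --conf spark.executor.instances=2 \
  --conf spark.kubernetes.container.image=myrepo/spark-py:3.5.0 \
  local:///opt/spark/app/job.py   # script baked into the image
```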
Nice
Super
Thankyou
Nice
Nice
4 chin😂😂😂😂
Super
Thanks
Good
Nice
Awesome 🎉
Thank you! Cheers!
Awesome 👌
Thank you! Cheers!
thank you
Welcome!
My ip addr for the kubernetes dashboard is different than your and i am not able to access it the way you were able to by just using ip:port. Any suggestions ?
You should access the Kubernetes dashboard using the master node's IP address with the node port. Don't use the cluster IP of the Kubernetes dashboard service; it is internal to the VM.
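To find the node port to use (the namespace and service name below are the usual defaults, which may differ in your setup):

```shell
# Show the dashboard service; the NodePort appears in the PORT(S) column.
kubectl -n kubernetes-dashboard get svc kubernetes-dashboard
# Then browse to http://<master-node-ip>:<node-port>
```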
Can you try an ETL with msk ?
I will try
How do I make a master - backup in a kübernetes environment?
Please check this ua-cam.com/video/_4F_pV04csk/v-deo.html
Thank you for this great tutorial. Can you guide us on how to build a Dockerfile and then deploy it in the same environment, i.e. using Jenkins running on a pod which creates a Docker image of your project, uploads it to Docker Hub, and in the next stage deploys the same image into the same Kubernetes cluster using a deploy file?
In the video description I have mentioned the repository, which contains jenkinsagents.txt; you can see the build command at the end of that file. You can use that. You can also use the Docker image that I have built, which is configured in the deployment YAML file.
Great
Great video! To the point, and exactly what was needed! Keep us the great work! 👏🏻 👏🏻 👏🏻
Glad it was helpful!
Very nice explanation
Hi bro, can i contact you?
Yes
@configjavatech Hi Sir ji, it is very helpful, thank you so much for the mail guidance 🙏 Sir ji, both files work; I'm able to see the master driver and worker. I want to submit my job to calculate an average value on the Spark master driver. Please, where can I write my Spark code and how do I submit it? You sent me a YAML file with which I can create the Spark master driver and worker, but how do I submit my job to it? One more thing, sir ji: in Airflow's connection IDs I'm able to see MySQL, Amazon Web Services, and Postgres, but I'm not able to find a Spark connection.
Nice video, Thank you for the video ❣️
I have sent steps for running apache airflow for spark in Kubernetes. Was it helpful?
@@configlearninghub Hi Sir ji, yes it is very helpful 😊 I'm sending you my Raspberry Pi login credentials; it is like a remote machine. Sir ji, both attached files work, but after creating the Spark driver and worker nodes I want to submit my job, where I want to write a simple program to calculate an average value. Thank you so much for the help, dear sir ji. I have seen that Airflow has connection IDs where lots of connection types are available, like Amazon Web Services, MySQL, and Postgres, but Spark is not available there. Please, I need a little more help. I'm really sorry for disturbing you, but if you get time I need help. 🙏🙏🙏
I will share my Raspberry Pi login credentials with you today; it is just like a remote machine where you can see my installed applications, sir ji. 😊 And you can use it as well; I will be so happy ❤️
Is the Spark job in Python, or in the Java or Scala DSL? Where is your dataset stored?