- 50
- 1 168 926
coder2j
Germany
Joined 22 Feb 2021
Welcome to our channel, your ultimate destination for data engineering and data science tutorials based on Python!
Join us as we dive deep into popular tools like Apache Airflow, Apache Spark, PySpark, Pandas, and MySQL, and learn how to harness their power to solve real-world data challenges. Whether you're a beginner or an experienced data professional, our step-by-step tutorials will empower you to master these tools and elevate your data engineering and data science skills.
Subscribe now and join our growing community of data enthusiasts to stay updated with our latest tutorials and take your data expertise to new heights!
🙏 REQUEST VIDEOS
forms.gle/UMp4GA3krcSMMWzy9
LangChain Installation and Setup with OpenAI ChatGPT model | LangChain Tutorial P2
#langchain #chatbottutorial #largelanguagemodel #coder2j
========== VIDEO CONTENT 📚 ==========
In this video, I will show you how to install LangChain and set it up with the OpenAI ChatGPT model. By the end of this tutorial, you will have a proper OpenAI developer account set up and be able to use LangChain to chat with any OpenAI ChatGPT model.
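The setup described above boils down to making the API key available before creating the model. Here is a minimal sketch of that flow; the helper function and the placeholder key are ours for illustration, and the actual LangChain call is shown only in a comment since it needs a real key and network access:

```python
import os

# LangChain's OpenAI integration reads the key from the OPENAI_API_KEY
# environment variable. This helper is illustrative, not part of LangChain.
def require_openai_key() -> str:
    key = os.environ.get("OPENAI_API_KEY", "")
    if not key.startswith("sk-"):
        raise RuntimeError("Set OPENAI_API_KEY (an 'sk-...' key) before using ChatOpenAI")
    return key

os.environ["OPENAI_API_KEY"] = "sk-placeholder"  # dummy value for illustration only
print(require_openai_key())

# With a real key set, the LangChain side is typically:
#   from langchain_openai import ChatOpenAI
#   llm = ChatOpenAI(model="gpt-3.5-turbo")
#   llm.invoke("Hello!")
```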
🔔 Subscribe: Don't forget to subscribe to our channel for more exciting tutorials. ua-cam.com/users/coder2j
💬 Leave a comment to let us know what topics you'd like to see in our next tutorial. 🚀
📺 Video Request: forms.gle/UMp4GA3krcSMMWzy9
🧑💻 Want to learn more?
2-hour beginner Airflow Tutorial: ua-cam.com/video/K9AnJ9_ZAXE/v-deo.html
1-hour beginner PySpark Tutorial: ua-cam.com/video/EB8lfdxpirM/v-deo.html
Dagster Tutorial: ua-cam.com/play/PLwFJcsJ61ougyjsQnl8q-P_85uwquMrWv.html
========== T I M E S T A M P ⏰ ==========
Throughout the course, you will learn:
00:00 - Intro
02:03 - Set up the OpenAI API key
03:06 - Integrate LangChain with the OpenAI ChatGPT model
========== L I N K S 🔗 ==========
Airflow 2-hour FULL COURSE 👉 ua-cam.com/video/K9AnJ9_ZAXE/v-deo.html
PySpark Tutorial for Beginners 👉 ua-cam.com/play/PLwFJcsJ61ouiU1wvzzRk3pjU8xT9buJhr.html
SQL Tutorial for Beginners 👉 ua-cam.com/play/PLwFJcsJ61ouizyVPDIjomZFb2S_zcJebP.html
Airflow Tutorial Tips 👉 ua-cam.com/play/PLwFJcsJ61oujb3syZ7jh72iTF_kL1YgU0.html
Apache Airflow Tutorial for Beginners 👉 ua-cam.com/play/PLwFJcsJ61oujAqYpMp1kdUBcPG0sE0QMT.html
========== Connect with me 👏 ==========
Twitter 👉 Coder2j
Website 👉 coder2j.com
GitHub 👉 github.com/coder2j
Views: 527
Videos
LangChain Introduction | LangChain Tutorial P1
463 views, 3 months ago
LangChain Introduction | LangChain Tutorial P1 #langchain #chatbottutorial #largelanguagemodel #coder2j VIDEO CONTENT 📚 In this video, I will introduce you to what LangChain is. 🔔 Subscribe: Don't forget to subscribe to our channel for more exciting tutorials. ua-cam.com/users/coder2j 💬 Leave a comment to let us know what topics you'd like to see in our next tutorial. 🚀 📺 Video Request: forms.gle/...
What is serverless? | Tech Terms You Should Know
932 views, 6 months ago
Welcome to the Tech Terms You Should Know series with Coder2j! In today's episode, we're delving into the world of serverless computing. 🚀 Serverless computing revolutionizes cloud development by eliminating the hassle of managing servers, allowing developers to focus solely on coding and deployment. But what exactly is serverless, and how does it work? Join us as we explore the fundamentals of...
Dagster Tutorial: Building Dagster Job & Schedule
3.7K views, 10 months ago
Dagster Tutorial: Building Dagster Job & Schedule #DagsterTutorial #DagsterJob #DagsterSchedule #Coder2j VIDEO CONTENT 📚 In this video, I will walk you through the process of building a Dagster job and scheduling it to run periodically. We will start by introducing what a Dagster definition is. Then we define a job and add assets to it. Next, we will explore how to create a job with a subset of...
Dagster Tutorial: Building an Asset Graph
3.4K views, 10 months ago
Dagster Tutorial: Building an Asset Graph #DagsterTutorial #DagsterAssetDependency #DagsterAsset #Coder2j VIDEO CONTENT 📚 In this video, I will walk you through how to build an asset graph, incorporating multiple assets and managing their dependencies. Learn how to effortlessly create and manage assets, diving into the details of the process step by step. Creating Dagster Assets: Discover the p...
Dagster Tutorial: Dagster Installation and Getting Started with Asset
7K views, 11 months ago
Dagster Tutorial: Dagster Installation and Getting Started with Asset #DagsterTutorial #DagsterInstall #DagsterGettingStarted #DagsterAsset VIDEO CONTENT 📚 In this video, I will walk you through how to install Dagster on both macOS and Windows. I start by verifying Python versions, creating virtual environments, and installing necessary packages for Dagster. You'll witness the creation of a Pyt...
Airflow Tutorial: End-to-End Machine Learning Pipeline with Docker Operator
8K views, 11 months ago
Airflow Tutorial: End-to-End Machine Learning Pipeline with Docker Operator #AirflowTutorial #AirflowDockerOperator #MachineLearningPipeline #DataEngineering VIDEO CONTENT 📚 In this comprehensive Airflow tutorial, we guide you through the process of creating an end-to-end machine-learning pipeline using the powerful Airflow Docker Operator. Learn how to simplify complex workflows and enhance yo...
PySpark Tutorial for Beginners
86K views, 1 year ago
PySpark Tutorial for Beginners #SparkTutorial #pysparkTutorial #ApacheSpark VIDEO CONTENT 📚 Welcome to this comprehensive 1-hour PySpark tutorial for beginners! In this tutorial, you'll embark on a journey into the world of Apache Spark, where theory meets practical application. No prior PySpark experience is necessary, making it perfect for newcomers. Course Highlights: - Spark Introduction: U...
Spark SQL and SQL Operations | PySpark Tutorial for Beginners
2.4K views, 1 year ago
Spark SQL and SQL Operations | PySpark Tutorial for Beginners #SparkTutorial #PySparkTutorial #ApacheSpark VIDEO CONTENT 📚 Welcome to our PySpark tutorial series! In this video, we'll explore Spark SQL and SQL operations with PySpark. Learn how to work with DataFrames, create temporary views, and perform advanced SQL operations like subqueries and window functions. Subscribe, enable notificatio...
Spark DataFrame Operations | PySpark Tutorial for Beginners
2K views, 1 year ago
Spark DataFrame Operations | PySpark Tutorial for Beginners #SparkTutorial #PySparkTutorial #ApacheSpark VIDEO CONTENT 📚 Welcome to our PySpark tutorial series! In this video, we'll explore essential DataFrame operations to supercharge your data analysis. We'll cover loading data from a CSV, and diving into operations like column selection, row filtering, grouping, joining, sorting, distinct fi...
Create Spark DataFrame from CSV JSON Parquet | PySpark Tutorial for Beginners
1.6K views, 1 year ago
Create Spark DataFrame from CSV JSON Parquet | PySpark Tutorial for Beginners #SparkTutorial #PySparkTutorial #ApacheSpark VIDEO CONTENT 📚 Welcome to our PySpark tutorial series! In this video, we'll guide you through the process of reading data from various sources like CSV and JSON files. We'll cover reading files with different options, such as headers and explicit schemas. You'll also learn...
Spark DataFrame Intro & vs RDD | PySpark Tutorial for Beginners
1.5K views, 1 year ago
Spark DataFrame Intro & vs RDD | PySpark Tutorial for Beginners #SparkTutorial #PySparkTutorial #ApacheSpark VIDEO CONTENT 📚 Welcome to our PySpark tutorial series! In this video, we delve into DataFrames, a powerful abstraction for distributed and structured data. Learn their benefits over RDDs: optimized execution, user-friendly interface, ecosystem integration, built-in optimization, and int...
Spark RDD Transformations and Actions | PySpark Tutorial for Beginners
3.7K views, 1 year ago
Spark RDD Transformations and Actions | PySpark Tutorial for Beginners #SparkTutorial #PySparkTutorial #ApacheSpark VIDEO CONTENT 📚 Welcome to our PySpark tutorial series! In this video, we delve into the world of Spark RDD transformations and actions. Spark RDDs (Resilient Distributed Datasets) are the building blocks of distributed data processing in Apache Spark. Learn how to leverage the po...
Create SparkSession in PySpark | PySpark Tutorial for Beginners
2.6K views, 1 year ago
Create SparkSession in PySpark | PySpark Tutorial for Beginners #SparkTutorial #PySparkTutorial #ApacheSpark VIDEO CONTENT 📚 Welcome back to our PySpark tutorial series! In this video, learn how to create a SparkSession in PySpark to establish a connection to a Spark cluster. Follow the simple steps of importing the necessary module, configuring the session using the builder pattern, and settin...
Create SparkContext in PySpark | PySpark Tutorial for Beginners
2.4K views, 1 year ago
Create SparkContext in PySpark | PySpark Tutorial for Beginners
SparkContext vs SparkSession | PySpark Tutorial for Beginners
4.5K views, 1 year ago
SparkContext vs SparkSession | PySpark Tutorial for Beginners
Spark Installation on Windows 10 and Mac | PySpark Tutorial for Beginners
9K views, 1 year ago
Spark Installation on Windows 10 and Mac | PySpark Tutorial for Beginners
Spark Introduction | PySpark Tutorial for Beginners
9K views, 1 year ago
Spark Introduction | PySpark Tutorial for Beginners
Airflow Email Notification on Failure | Airflow Tutorial Tips 4
12K views, 1 year ago
Airflow Email Notification on Failure | Airflow Tutorial Tips 4
pandas select rows and columns with loc
156 views, 1 year ago
pandas select rows and columns with loc
How to create S3 connection for AWS and MinIO in latest airflow version | Airflow Tutorial Tips 3
9K views, 1 year ago
How to create S3 connection for AWS and MinIO in latest airflow version | Airflow Tutorial Tips 3
MySQL Sorting, Order By and Group By | SQL Tutorial For Beginners (MySQL)
328 views, 1 year ago
MySQL Sorting, Order By and Group By | SQL Tutorial For Beginners (MySQL)
MySQL SELECT WHERE IN Between And LIKE LIMIT Filter Rows | SQL Tutorial For Beginners (MySQL)
424 views, 1 year ago
MySQL SELECT WHERE IN Between And LIKE LIMIT Filter Rows | SQL Tutorial For Beginners (MySQL)
MySQL data import and export, load data infile, into outfile | SQL Tutorial For Beginners (MySQL)
3.5K views, 1 year ago
MySQL data import and export, load data infile, into outfile | SQL Tutorial For Beginners (MySQL)
Create Insert Update Delete Drop MySQL Table | SQL Tutorial For Beginners (MySQL)
786 views, 1 year ago
Create Insert Update Delete Drop MySQL Table | SQL Tutorial For Beginners (MySQL)
Bro, I'm getting an error. Can you help me out?
Thank you for not to be a indian voice.
Continue making videos please
And what about installing packages with _PIP_ADDITIONAL_REQUIREMENTS in docker-compose.yaml directly?
amazing! Thanks for the share
I need your help. I downloaded Airflow version 2.10.2. When I select the connection type, it doesn't have Amazon S3, even though I have the Amazon provider at version 8.28.0.
When I run the server and open the localhost link, there are no DAGs listed there. What could be the issue?
thank you for this tutorial
I think the Airflow installation is getting complicated nowadays, even for a developer with sound knowledge of infra. This video made my day.
Excellent work. Thanks so much for the content.
Hi, I liked your video. Can you also upload the PDFs for our reference?
How do I stop the Airflow webserver?
Hi, if XCom is not suitable for large data, how can we share data between tasks then? If it is covered in another video, could you tell me its number?
Good video, thanks for sharing! For anyone wondering how to run spark in Kubernetes i have a workshop/tutorial for it! ua-cam.com/video/Vwhaq4ezmaw/v-deo.html
Crystal clear, thx!!!🎉
I am unable to see the log details in stdout and stderr. Kindly suggest
What can be done when spark = SparkSession.builder.appName('Practise').getOrCreate() keeps on running and never finishes executing?
[2024-09-14, 10:25:59 UTC] {taskinstance.py:441} ▼ Post task execution logs
[2024-09-14, 10:25:59 UTC] {taskinstance.py:2905} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/models/taskinstance.py", line 465, in _execute_task
    result = _execute_callable(context=context, **execute_callable_kwargs)
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/models/taskinstance.py", line 432, in _execute_callable
    return execute_callable(context=context, **execute_callable_kwargs)
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/models/baseoperator.py", line 401, in wrapper
    return func(self, *args, **kwargs)
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/operators/python.py", line 235, in execute
    return_value = self.execute_callable()
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/operators/python.py", line 252, in execute_callable
    return self.python_callable(*self.op_args, **self.op_kwargs)
  File "/opt/airflow/dags/dag_with_postgres_hooks.py", line 13, in postgres_to_s3
    conn = hook.get_conn()
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/postgres/hooks/postgres.py", line 175, in get_conn
    self.conn = psycopg2.connect(**conn_args)
  File "/home/airflow/.local/lib/python3.12/site-packages/psycopg2/__init__.py", line 122, in connect
    conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xc2 in position 83: invalid continuation byte
[2024-09-14, 10:25:59 UTC] {taskinstance.py:1206} INFO - Marking task as FAILED. dag_id=dag_with_postgres_hooks_v01, task_id=postgres_to_s3, run_id=scheduled__2024-09-03T00:00:00+00:00, execution_date=20240903T000000, start_date=20240914T102555, end_date=20240914T102559
[2024-09-14, 10:25:59 UTC] {standard_task_runner.py:110} ERROR - Failed to execute job 144 for task postgres_to_s3 ('utf-8' codec can't decode byte 0xc2 in position 83: invalid continuation byte; 4317)
[2024-09-14, 10:25:59 UTC] {local_task_job_runner.py:243} INFO - Task exited with return code 1
Thank you for this tutorial ! Magnificent !!
Thanks for the video, but I have a question, so are we using hooks here instead of the providers you showed before?
I found this to be an excellent guided tour, but only because I've gone through a lot of tutorials on webpages, books, and videos. PySpark is so vast and diverse that, after seeing all these other tutorials, one needs a video like this one to knit them all together.
Glad it helped! ☺️
Great course, thanks a lot! I have a slight problem though with my scheduler and webserver containers in the part at 1:24:36. My containers constantly keep switching between restarting and running status, so I am not able to access the webserver. I tried the steps again, but I still can't solve the issue. Help would be appreciated.
Great course, I look forward to watching more content.
setup tutorial is officially obsolete. if you're reading this go and find another one lol
thank you sooo much this was really awesome
Do I need to create a new env after closing or restarting a session?
Wasn't expecting a 50 min video to be this informative! Covered all the major topics related to PySpark. Truly, a well structured video and quality content. Thanks!
Wonderful course for beginners like me. Very helpful. Thanks a lot
I have an issue: "raise error_class(parsed_response, operation_name) botocore.exceptions.ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden". How do we resolve it?
Setting Extra in the connection like below, with AWS Access Key ID and AWS Secret Access Key set to their respective values, worked for me: { "host": "host.docker.internal:9000" }
Airflow 2.10.1 is not liking it when we give the extras like this: { "aws_access_key_id": "mZtRj1zvXa4uAv0fIRqY", "aws_secret_acess_key": "7KpTz9q67OEJX1sAN7b02sB803OsYNILXpz5XoSJ", "host": "host.docker.internal:9000" }
Hope this will be helpful for all
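A broken "Extra" JSON like the one in this thread can be caught before saving the connection. The sketch below checks a raw extras string against the key names the Amazon provider expects; the `check_extras` helper is hypothetical, and the values are placeholders:

```python
import json

# Keys the Airflow Amazon provider looks for in the connection "Extra" field.
REQUIRED = {"aws_access_key_id", "aws_secret_access_key"}

def check_extras(raw: str) -> list:
    """Return the expected keys that are missing from the extras JSON."""
    extras = json.loads(raw)
    # A typo such as "aws_secret_acess_key" shows up here as a missing key.
    return sorted(REQUIRED - extras.keys())

bad = '{"aws_access_key_id": "x", "aws_secret_acess_key": "y", "host": "host.docker.internal:9000"}'
print(check_extras(bad))  # flags the misspelled secret-key entry
```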
I think the access is misspelled.
Thanks for sharing this.
If you are using MinIO locally and, after doing everything right, you still get this error: "botocore.exceptions.ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden", ensure you go to "Configuration" in local MinIO and set the "Server Location" to "us-east-1".
Where is the pyspark ML? you promised :'(
13:49 gives an error for me: "spark not defined".
I am unable to get the Jupyter notebooks to work. It keeps complaining about py4j. I am able to run the commands in a terminal, though. Not sure what the setup is supposed to be. I wish this video had started with how to install everything.
Amazing videos
ERROR: failed to solve: process "/bin/bash -o pipefail -o errexit -o nounset -o nolog -c pip install --user --upgrade pip" did not complete successfully: exit code: 1
This is the error I get when I build the image. My Dockerfile:
FROM apache/airflow:2.9.3
COPY requirements.txt /requirements.txt
RUN pip install --user --upgrade pip
RUN pip install --no-cache-dir --user -r /requirements.txt
The error happens within the pip upgrade command. Can you try removing the --user part?
@coder2j Yes, I did that and it helped.
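For reference, applying the fix discussed in this thread (dropping --user from the pip upgrade line only) gives a Dockerfile along these lines; this is a sketch assuming the same base image and requirements file as above:

```dockerfile
FROM apache/airflow:2.9.3
COPY requirements.txt /requirements.txt
# Upgrading pip without --user avoids the build failure discussed above;
# the extra packages are still installed into the airflow user's site-packages.
RUN pip install --upgrade pip
RUN pip install --no-cache-dir --user -r /requirements.txt
```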
Though I have followed all the steps to create the DAG with the Postgres operator, the DAG is not reflecting on the Airflow page. Everything before that worked well. What could be the issue? It seems the module for "from airflow.providers.postgres.operators.postgres import PostgresOperator" is not present somewhere. Any help on this would be useful.
Thanks for the video. Anyone else got this error? ...{connection_wrapper.py:384} INFO - AWS Connection (conn_id='minio', conn_type='aws') credentials retrieved from login and password. ... ERROR - Task failed with exception ... raise EndpointConnectionError(endpoint_url=request.url, error=e) botocore.exceptions.EndpointConnectionError: Could not connect to the endpoint URL: "localhost:9000/***/first_100_rows.csv" ... For some reason it shows the URL with *** instead of the bucket name. Maybe this has to do with it? Anyone else had this issue? After half a day, I still haven't figured it out.
I made sure that minio and airflow are on the same network, so I know thats not the issue
Great videos, thank you very much for the help, it would be great if you continued with more explanations, such as how the daemon works or how to link a Jupyter Notebook to a job and show the logs of the Jupyter Notebook processes. thank you so much
I am trying to follow along; however, I got an error when running df.show(). I am running on Windows and using JDK 21. Py4JJavaError: An error occurred while calling o42.showString. : org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0) (PIR-HRHNTW3-XZ.fios-router.home executor driver): org.apache.spark.SparkException: Python worker exited unexpectedly (crashed)
You saved me with the last clip. I had a complicated docker-compose and couldn't figure out why the examples kept showing even though I turned everything off. The docker-compose file had load examples hard-coded to true. Duhhhh!
Awesome. Glad it worked!
How could I solve the "ERROR - Failed to establish connection to Docker host unix:///var/run/docker.sock: Error while fetching server API version: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))" problem? I have checked the Docker version, and unix:///var/run/docker.sock is running well.
Did you start Docker Desktop or the Docker services?
@coder2j Hi, thank you for your reply. I finally fixed it by adding "- /var/run/docker.sock:/var/run/docker.sock" under volumes in the docker-compose.yml file. Could you explain why? Thank you for your amazing walk-through. Great videos for starting to learn Airflow. I am looking forward to learning about PySpark on your channel.
Glad you figured it out. It tells the Docker operator the path of the Docker daemon socket it should use to run the Docker container. Enjoy learning! Do let me know if you have any feedback! 😉
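For anyone hitting the same error, the mount described in this thread looks like this in docker-compose.yaml. This is a sketch: the service name is a placeholder for whichever Airflow service runs your DockerOperator tasks:

```yaml
services:
  airflow-scheduler:   # your Airflow service name may differ
    volumes:
      # Expose the host's Docker daemon socket inside the container so the
      # DockerOperator can launch containers on the host's daemon.
      - /var/run/docker.sock:/var/run/docker.sock
```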
This is the latest and most beginner-friendly tutorial regarding Apache Spark. As an aspiring data engineer, this is absolutely helpful! If you ever have free time, I would like to request a tutorial series on Docker, dbt, and Kafka, and if possible, could you add more tutorials for Dagster? As a student, out of all the tutorials on YouTube, I was able to follow your tutorial the best. Thank you for the Airflow and PySpark playlists. I hope you keep recording 😊
Thank you for your feedback. I will definitely consider that! 😍
How can we set this up for multiple environments like Dev and Prod? Can you please guide us through it?
You can use the same docker compose config and deploy it to different virtual machines or EC2 instances for staging and production environments.
I learned a lot from this video, thank you.
Glad it helped!
Outstanding tutorial! I hope you can release a Kafka tutorial.
Thank you for the feedback!
Hello brother, my example DAGs are still visible after setting load_examples to false.
Make sure you have set load_examples in the right configuration file, and you need to restart Airflow after that.
I did perfectly you done on the video and restarted well please help me brother
@kitty8170 Check the value of load_examples using the Airflow CLI: $ airflow config get-value core load_examples
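A common cause of "I set it to false but the CLI still says true" is that Airflow resolves config with environment variables taking precedence over airflow.cfg. The snippet below mimics that precedence rule in plain Python; it is a sketch of the behavior, not Airflow's actual implementation:

```python
import os

# Airflow-style config resolution: an AIRFLOW__<SECTION>__<KEY> environment
# variable overrides the value in airflow.cfg.
def get_config(section, key, cfg):
    env_name = "AIRFLOW__%s__%s" % (section.upper(), key.upper())
    return os.environ.get(env_name, cfg.get(section, {}).get(key, ""))

cfg = {"core": {"load_examples": "True"}}              # value in airflow.cfg
os.environ["AIRFLOW__CORE__LOAD_EXAMPLES"] = "False"   # env override wins
print(get_config("core", "load_examples", cfg))        # prints False
```

So if an AIRFLOW__CORE__LOAD_EXAMPLES variable is set (for example in docker-compose), editing airflow.cfg alone will not change what the CLI reports.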
@coder2j I did it, brother. It's showing true in the CLI command even though I set it to false, and I also stopped and restarted.
@coder2j It's working now, thanks!!
Is there another way to add these credentials to Airflow without doing it from within the UI?
You can do it via environment variables or the Airflow CLI: airflow.apache.org/docs/apache-airflow/stable/howto/connection.html
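The environment-variable route mentioned above works by exporting AIRFLOW_CONN_<CONN_ID> set to a connection URI. A minimal sketch; the connection id "minio" and all credential values here are placeholders, not a real setup:

```python
import os
from urllib.parse import quote

# Airflow reads connections from AIRFLOW_CONN_<CONN_ID> environment variables
# holding a connection URI, so no UI clicking is needed.
endpoint = quote("http://host.docker.internal:9000", safe="")
os.environ["AIRFLOW_CONN_MINIO"] = "aws://ACCESS_KEY:SECRET_KEY@/?endpoint_url=" + endpoint
print(os.environ["AIRFLOW_CONN_MINIO"])
```

In docker-compose, the same thing is usually done by adding the variable under the service's environment section.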
@coder2j thank you