Cloud Architect Abhiram
India
Joined Dec 6, 2023
Welcome to our Channel☁️
Dive into the transformative world of cloud-based technology with us! Whether you're curious about Cloud Technology or the latest trends in cloud infrastructure, you'll find it all here.
Are you seeking opportunities to get placed in the IT sector? Our Training and Placements program helps you get there.
🎓 Learn from Industry Experts: Our courses are designed & taught by professionals who know the industry inside out.
🎯 Don't miss this opportunity to become a sought-after IT professional. Enroll now & secure your future in tech!
🏁 Start your career growth today 📈 and unlock the door to opportunities
How to Find out the Total Winning Count of each team | Pyspark Realtime Scenario #pyspark #azure
320 views
Videos
How to Concatenate two lists row-wise | Python Coding Challenge #pythontutorial #python
145 views · 4 months ago
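For readers skimming the list: the row-wise concatenation in this challenge can be sketched in a few lines of plain Python (the sample lists below are made up for illustration, not taken from the video).

```python
# Row-wise concatenation: pair elements of two lists by position.
list_a = [1, 2, 3]
list_b = ["x", "y", "z"]

# zip() walks both lists in lockstep; each pair becomes one row.
rows = [[a, b] for a, b in zip(list_a, list_b)]
print(rows)  # [[1, 'x'], [2, 'y'], [3, 'z']]
```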
How to Merge two lists using a Loop | Python Coding Challenge #pythontutorial #python
61 views · 4 months ago
How to Concatenate two lists element-wise | Python Coding Challenge #pythontutorial #python
50 views · 4 months ago
How to Remove Multiple Elements From a List | Python Coding Challenge #pythontutorial #python
60 views · 4 months ago
How to remove an element from a list using Python | Python Coding Challenge #pythontutorial #python
26 views · 4 months ago
How to Find the Most Frequent Elements in a list | Python Coding Challenge #pythontutorial #python
32 views · 4 months ago
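One common way to solve this challenge, assuming the goal is the single most frequent element, is collections.Counter (the sample data here is illustrative, not from the video):

```python
from collections import Counter

# Counter tallies occurrences; most_common(1) returns a list
# holding the (element, count) pair for the top element.
items = [3, 1, 3, 2, 3, 2]
most_frequent, count = Counter(items).most_common(1)[0]
print(most_frequent, count)  # 3 3
```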
How to find strings in a list | Python Coding Challenge #coding #pythontutorial #python
30 views · 4 months ago
How to Find Returning Active Users Using PySpark | Pyspark Realtime Scenario #pyspark #azure
119 views · 4 months ago
List of Airlines Operating Flights to all destinations | Pyspark Realtime Scenario #pyspark #azure
107 views · 4 months ago
How to identify products with increasing yearly sales | Pyspark Realtime Scenario #pyspark #azure
81 views · 4 months ago
How to find the length of a List in Python | Python Coding Challenge #pythonprogramming #pythontutorial
60 views · 4 months ago
Interchange First & Last Elements in a List | Python Coding Challenge #coding #pythontutorial #python
72 views · 4 months ago
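The swap in this challenge can be sketched with Python's tuple assignment (the function name and sample list below are made up for illustration):

```python
def swap_first_last(lst):
    # Tuple assignment swaps the two ends without a temp variable;
    # the guard avoids an IndexError on an empty list.
    if lst:
        lst[0], lst[-1] = lst[-1], lst[0]
    return lst

print(swap_first_last([1, 2, 3, 4]))  # [4, 2, 3, 1]
```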
How to hide mobile number digits in Pyspark | Pyspark Realtime Scenario #pyspark #databricks #azure
541 views · 4 months ago
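The masking idea in this scenario can be sketched in plain Python with re.sub; in PySpark the same regex works with F.regexp_replace. The exact rule below (keep only the last four digits visible) is an assumption about what the video does, and the sample number is fake.

```python
import re

def mask_mobile(number: str) -> str:
    # Mask every digit that still has at least four digits after it,
    # leaving only the last four digits visible.
    return re.sub(r"\d(?=\d{4})", "*", number)

print(mask_mobile("9876543210"))  # ******3210
```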
How to Swap Seat Ids in PySpark | Pyspark Realtime Scenario #pyspark #databricks #azure
278 views · 4 months ago
How to add a Filename to a DataFrame in Pyspark | Pyspark Realtime Scenario #pyspark #databricks #azure
345 views · 4 months ago
Cumulative Salary of an Employee in Pyspark | Pyspark Realtime Scenario #pyspark #databricks #azure
336 views · 4 months ago
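A cumulative (running) salary can be sketched in plain Python with itertools.accumulate; in PySpark the equivalent is typically F.sum over an ordered Window. The figures below are invented for illustration.

```python
from itertools import accumulate

# Running total: each entry is the sum of all salaries so far.
salaries = [1000, 2000, 1500]
cumulative = list(accumulate(salaries))
print(cumulative)  # [1000, 3000, 4500]
```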
PySpark program to find customers who purchased all products from the product table
117 views · 4 months ago
How to find customers who have not placed any order in the orders table in PySpark | Realtime Scenario
126 views · 4 months ago
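The logic of this scenario is an anti-join: keep the customers whose id never appears among the orders. Here is a plain-Python sketch with made-up sample data; in PySpark this is usually expressed as a "left_anti" join.

```python
# Hypothetical sample data: customer ids to names, and (order_id, customer_id) pairs.
customers = {1: "Asha", 2: "Ravi", 3: "Meena"}
orders = [(101, 1), (102, 3)]

# Collect the ids that placed at least one order, then keep the rest.
ordered_ids = {cust_id for _, cust_id in orders}
no_orders = [name for cid, name in customers.items() if cid not in ordered_ids]
print(no_orders)  # ['Ravi']
```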
How to handle Multiple Delimiters in PySpark | Pyspark Realtime Scenario #pyspark #databricks #azure
84 views · 4 months ago
Count rows in each column where nulls are present in a Data Frame | Pyspark Realtime Scenario #pyspark
128 views · 4 months ago
Remove Duplicates in PySpark | Pyspark Realtime Scenario
150 views · 4 months ago
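The core of de-duplication can be sketched in plain Python; in PySpark the usual call is df.dropDuplicates(). dict.fromkeys keeps the first occurrence of each value and preserves insertion order (Python 3.7+). The sample values are illustrative.

```python
# Order-preserving de-duplication: dict keys are unique and keep
# insertion order, so the first occurrence of each value survives.
values = [1, 2, 1, 3, 2]
deduped = list(dict.fromkeys(values))
print(deduped)  # [1, 2, 3]
```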
Simple Data Frame Creation in Pyspark | Pyspark Realtime Scenario #pyspark #databricks #azure
223 views · 4 months ago
Azure Data Factory Storage Account Creation || Azure Portal #azuredatafactory #azureportal #azure
117 views · 6 months ago
How to Become an Azure Data Engineer? || Data Engineer Tutorial
199 views · 6 months ago
Azure Databricks Tutorial || Introduction to Databricks #azuredatabricks #azuredatafactory #azure
146 views · 6 months ago
Informatica IICS || JSON Parsing & Process Creation || #informatica #json #parsing
317 views · 6 months ago
Informatica Power Center to IICS Migration | #informatica #iics #informaticapowercenter
1.6K views · 6 months ago
Informatica IICS || Lookup & Unconnected Lookup || #informatica #iics #itjobs2024
228 views · 6 months ago
Informatica IICS || Synchronization Task & Replication Task #iics #itjobs2024 #informatica
87 views · 6 months ago
Please remove background sound
Hello sir, you keep posting reels; you could explain a little more, couldn't you, sir?
Hello Abhiram Sir ... Today's GCP interview question?
I believe you can read documentation very well rather than understanding it
What are the age and career criteria for jobs after getting IICS training? Please give a genuine answer: is it possible at the age of 38?
Starting or progressing in an IT profession, particularly positions utilizing Informatica Intelligent Cloud Services (IICS), is rarely limited by age. Your abilities to properly communicate your knowledge and capabilities to prospective employers are what really count.
@@CloudMaster_Abhiram Thank you for the quick response. You mean age is not a barrier in the IT field, even over 38?
@@SB-ln3ln Whether you're in your 30s, 40s or 50s, it's not too late to take actionable steps to change your career.
Music 😅
May I have your contact to get training on PySpark?
Reach us over WhatsApp: +91 9281106429
Method 1: Reading from a CSV file
from pyspark.sql import SparkSession
spark = SparkSession.builder.appName("DataFrame creation example").getOrCreate()
df = spark.read.csv("file_path.csv", header=True, inferSchema=True)

Method 2: Reading from a JSON file
df = spark.read.json("file_path.json")

Method 3: Reading from a Parquet file
df = spark.read.parquet("file_path.parquet")

Method 4: Reading from a database
jdbc_url = "jdbc:mysql://localhost:3306/mydb"
table_name = "my_table"
properties = {"user": "my_user", "password": "my_password"}
df = spark.read.jdbc(url=jdbc_url, table=table_name, properties=properties)
Syntax:
from pyspark.sql import SparkSession
# Initialize SparkSession
spark = SparkSession.builder.appName("DFSize").getOrCreate()
sc = spark.sparkContext
# Create a DataFrame
df = spark.range(1000000)
# Force the DataFrame into the cache
df.cache().count()
# Estimate the size of the cached DataFrame in bytes
size_in_bytes = sc._jvm.org.apache.spark.util.SizeEstimator.estimate(df._jdf)
# Size in megabytes
size_in_mb = size_in_bytes / (1024 ** 2)
# Size in gigabytes
size_in_gb = size_in_bytes / (1024 ** 3)
df.unpersist()
Syntax for Pipelines in PySpark MLlib:
from pyspark.ml import Pipeline
from pyspark.ml.feature import StringIndexer
from pyspark.ml.classification import RandomForestClassifier
indexer = StringIndexer(inputCol="category", outputCol="categoryIndex")
rf = RandomForestClassifier(featuresCol="features", labelCol="categoryIndex")
pipeline = Pipeline(stages=[indexer, rf])
model = pipeline.fit(trainingData)
Syntax: PySpark's MLlib for machine learning tasks
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.feature import VectorAssembler
assembler = VectorAssembler(inputCols=["feature1", "feature2"], outputCol="features")
df_transformed = assembler.transform(df)
lr = LogisticRegression(featuresCol="features", labelCol="label")
model = lr.fit(df_transformed)
Syntax for finding the top N most frequent words in a large text file:
from pyspark import SparkContext
# Create your Spark context
sc = SparkContext("local", "WordCount")
# Import a text file from a local path
lines = sc.textFile("path/to/your/text/file.txt")
# Split into words, map each word to a count of 1,
# then reduce by using the words as keys and summing the counts
word_counts = lines.flatMap(lambda line: line.split(" ")) \
    .map(lambda word: (word, 1)) \
    .reduceByKey(lambda a, b: a + b)
# Order the words and take only the top N most frequent
N = 10  # choose how many words to return
top_n_words = word_counts.takeOrdered(N, key=lambda x: -x[1])
print(top_n_words)
import pyspark
from pyspark.sql import SparkSession
# Create a Spark session
spark = SparkSession.builder.appName("PivotExample").getOrCreate()
data = [("Devara", 1000, "India"), ("Kalki", 1500, "India"), ("Pushpa", 1600, "India"),
        ("Devara", 4000, "USA"), ("Pushpa", 1200, "USA"), ("Kalki", 1500, "USA"),
        ("Pushpa", 2000, "Canada"), ("Kalki", 2000, "Canada"), ("Devara", 2000, "Mexico")]
columns = ["Product", "Amount", "Country"]
df = spark.createDataFrame(data=data, schema=columns)
df.printSchema()
df.show(truncate=False)

Output:
root
 |-- Product: string (nullable = true)
 |-- Amount: long (nullable = true)
 |-- Country: string (nullable = true)

+-------+------+-------+
|Product|Amount|Country|
+-------+------+-------+
|Devara |1000  |India  |
|Pushpa |1600  |India  |
|Kalki  |1500  |India  |
|Devara |4000  |USA    |
|Pushpa |1200  |USA    |
|Kalki  |1500  |USA    |
|Devara |2000  |Mexico |
|Pushpa |2000  |Canada |
|Kalki  |2000  |Canada |
+-------+------+-------+

To determine the total amount of each product's exports to each country, we group by Product, pivot by Country, and sum Amount.

pivotDF = df.groupBy("Product").pivot("Country").sum("Amount")
pivotDF.printSchema()
pivotDF.show(truncate=False)

This converts the countries from DataFrame rows into columns, producing the output below.

Output:
root
 |-- Product: string (nullable = true)
 |-- Canada: long (nullable = true)
 |-- Mexico: long (nullable = true)
 |-- USA: long (nullable = true)
 |-- India: long (nullable = true)

+-------+------+------+----+-----+
|Product|Canada|Mexico|USA |India|
+-------+------+------+----+-----+
|Devara |null  |2000  |4000|1000 |
|Pushpa |2000  |null  |1200|1600 |
|Kalki  |2000  |null  |1500|1500 |
+-------+------+------+----+-----+
@@CloudMaster_Abhiram Thanks, I do follow your channel for Spark-related stuff ❤️ I want to prepare for Spark in depth; how should I go ahead? 😅
@@Rohit-r1q1h How about enrolling with us?
Where is the syntax? 😂
Haha, I forgot. I have just pinned the comment; do check it.
Good.. keep it up
Please do not add background music; it is too distracting. Or use quieter music.
Hi Abhiram, could you please let me know how to migrate jobs from DataStage to Informatica PowerCenter or IICS?
from pyspark.sql import SparkSession, Row
from pyspark.sql import functions as F
# Initialize Spark session
spark = SparkSession.builder.appName("concatenate_columns").getOrCreate()
# Sample DataFrame
data = [Row(struct_col=Row(a=1, b="foo"), array_col=[Row(c=3, d="bar"), Row(c=4, d="baz")])]
df = spark.createDataFrame(data)
# Flatten the struct column
flattened_struct_col = F.concat_ws(
    ",", *[F.col("struct_col." + field.name) for field in df.schema["struct_col"].dataType.fields]
)
# Flatten the array-of-structs column (list the struct fields explicitly;
# star expansion is not supported inside a transform lambda)
flattened_array_col = F.expr('concat_ws(",", transform(array_col, x -> concat_ws(",", x.c, x.d)))')
# Concatenate the two flattened columns
df = df.withColumn("concatenated_col", F.concat_ws(",", flattened_struct_col, flattened_array_col))
# Show the result
df.show(truncate=False)
Please don't put any background music, and give a little more time to read the slides.
@@abhishekshah581 Noted
Kindly stop that background music
😅 OK, from next time there will be new background music 👍
Please send the data file.
There is no voice, sir.
Do watch the entire video. At the end, you will find the overview explained clearly.
No voice
Watch the entire video.
@@CloudMaster_Abhiram I watched it, sir.
@@irugugopi203 Do watch the entire video. At the end, you will find the overview explained clearly.
Thank you so much. At the start, while creating the DataFrame with my input data, an arguments error came up, didn't it? Please explain that fix once, anna.
Nicely explained, sir.
Nice explanation sir
Enroll Now: "Azure Data Engineer Training & Placement Program" Start Date: Every Month 1st week || 7:00 pm IST For More Details: Call: +91 9281106429 Chat with us on WhatsApp: wa.me/qr/PSW2ILTYJHTZI1 👉Features of Online Training: 👉Real-Time Oriented Training 👉Live Training Sessions 👉Interview Preparation Tips 👉FAQ’s 👉100% Job Guarantee Program 👉Mock Interviews
How can I join?
Contact me +91 9281106429
Is the line spark = SparkSession.builder.master(...) needed?
What about standard cluster....?
It is a set of computation resources and configurations on which you run data engineering, data science, and data analytics workloads, such as production ETL pipelines, streaming analytics, ad-hoc analytics, and machine learning.
Hi sir, can you please share the document for what you explained in the short videos about Azure Databricks interview questions?
👍
😄😄
Thanks Abhiram for this informative content.
Thank you for your feedback! I'm glad you found the content informative.
👍
Good explanation
Thanks and welcome