Azure Synapse Analytics - Automating Serverless SQL Views

  • Published 5 Oct 2024
  • We're seeing a common pattern emerging of people creating views over openrowset queries so they can then query files in a lake directly from BI Tools like Power BI, using Synapse Serverless Pools - it's a great pattern but people are hand-cranking the view creation... and that's no good.
    In this video Simon talks through how you would go about creating views programmatically, using Serverless SQL Stored Procs & Dynamic SQL to build up the functionality you need to automate this arduous task.
    Details around Serverless Stored Procs can be found here: docs.microsoft...
    If you're getting started with Azure Synapse Analytics, check out advancinganalytics.co.uk/training to see how we can help you on your journey!
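The pattern Simon describes can be sketched roughly as follows. This is a minimal illustration, not the video's actual script: the procedure name, parameters, and file path are all hypothetical placeholders.

```sql
-- Hypothetical sketch: a serverless SQL stored proc that builds a view
-- over OPENROWSET using dynamic SQL. All names and paths are placeholders.
CREATE OR ALTER PROCEDURE dbo.CreateLakeView
    @ViewName sysname,
    @FilePath varchar(500)
AS
BEGIN
    DECLARE @sql nvarchar(max) =
        N'CREATE OR ALTER VIEW dbo.' + QUOTENAME(@ViewName) + N' AS
          SELECT *
          FROM OPENROWSET(
              BULK ''' + @FilePath + N''',
              FORMAT = ''PARQUET''
          ) AS [result];';

    -- Dynamic SQL lets one proc generate any number of views
    EXEC sp_executesql @sql;
END;
```

Calling `EXEC dbo.CreateLakeView 'vSales', 'https://.../sales/*.parquet'` (again, placeholder values) would then create or refresh the view, which is what makes the pattern easy to drive from a pipeline.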

COMMENTS • 31

  • @Mahmoudalgindy
    @Mahmoudalgindy 5 months ago

    Hmm, this means I have a lot more to learn.
    Thanks so much.
    Joining the data world keeps me learning every minute of my life 😫

  • @valentinloghin4004
    @valentinloghin4004 1 year ago +1

    Superb video!!! Do you have a video on Delta Lake creation?

  • @NeumsFor9
    @NeumsFor9 2 years ago +2

    I think you can run something like EXEC sp_describe_first_result_set @tsql (minus the create or alter view) and the serverless pool will at least give you the inferred types back. However, I think the inference is still not ideal.
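A minimal sketch of this suggestion, assuming parquet files; the storage URL is a placeholder, and whether the inference is good enough is exactly the commenter's caveat.

```sql
-- Sketch: ask the engine what types it infers for a query
-- before wrapping that query in a view. Path is a placeholder.
DECLARE @tsql nvarchar(max) =
    N'SELECT *
      FROM OPENROWSET(
          BULK ''https://mystorage.dfs.core.windows.net/lake/sales/*.parquet'',
          FORMAT = ''PARQUET''
      ) AS [result];';

-- Returns one row per column with the inferred name, type and nullability
EXEC sp_describe_first_result_set @tsql;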

  • @rafibasha4780
    @rafibasha4780 2 years ago +1

    How do we create views from an Azure DevOps pipeline instead of a Data Factory?

  • @dperezc88
    @dperezc88 2 years ago +3

    Excellent video! I developed a solution in which I built external tables on the Serverless pool, but I'm not sure whether I should expose my data using Views or External Tables. Do you have any recommendations for when I should use one over the other?

    • @federicosardo2314
      @federicosardo2314 1 year ago +1

      6 months later I have the same question.

    • @valentinloghin4004
      @valentinloghin4004 1 year ago

      I would create a view on top of the external table.

    • @user-pu6su5op3e
      @user-pu6su5op3e 1 year ago

      Can we create a view by joining external tables?

    • @valentinloghin4004
      @valentinloghin4004 1 year ago

      Create a view on each external table and join them afterwards.

    • @user-pu6su5op3e
      @user-pu6su5op3e 1 year ago

      @@valentinloghin4004 Oh okay, so it becomes a 3-step process:
      external table, then a view, then another view. Is there a better approach than the 3-step process?
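For what it's worth, a single view can join external tables directly, which collapses this to two steps. A hypothetical sketch, with table and column names as placeholders:

```sql
-- Sketch: one view joining two external tables directly,
-- skipping the intermediate per-table views. Names are placeholders.
CREATE OR ALTER VIEW dbo.vSalesWithCustomers AS
SELECT s.OrderId,
       s.Amount,
       c.CustomerName
FROM dbo.ext_Sales AS s
JOIN dbo.ext_Customers AS c
    ON s.CustomerId = c.CustomerId;
```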

  • @gastondemundo9822
    @gastondemundo9822 2 years ago

    Amazing! So with stored procedures running under SQL serverless you can save a lot of money on queries that users would never run themselves, but that you'd otherwise be paying for while waiting on them. What is the limitation of this? The scale of having more processing? A dedicated SQL pool lets me add more pools if I need them, but SQL on-demand doesn't?

  • @dmitryanoshin8004
    @dmitryanoshin8004 3 years ago

    Will Synapse take over from Azure Databricks for Azure customers?

  • @krisherpini
    @krisherpini 3 years ago

    Hi, really great idea!! Do you have those scripts?

  • @mcr132
    @mcr132 2 years ago

    Great! Is there any way we can manage the inferred schema to assign proper data types from the data files and reflect those in the SQL view (instead of accepting varchar(8000) for everything)?
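One option, sketched below under the assumption of parquet files: OPENROWSET accepts a WITH clause that overrides the inferred schema, so the view exposes proper types rather than wide varchars. The path and column list are placeholders.

```sql
-- Sketch: an explicit WITH clause pins the column types instead of
-- relying on inference. Path and columns are placeholders.
SELECT *
FROM OPENROWSET(
    BULK 'https://mystorage.dfs.core.windows.net/lake/sales/*.parquet',
    FORMAT = 'PARQUET'
)
WITH (
    OrderId   int,
    OrderDate date,
    Amount    decimal(18, 2),
    Region    varchar(50)
) AS [result];
```

Wrapping that SELECT in `CREATE OR ALTER VIEW` then gives a view with stable, properly-typed columns.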

  • @vasantskr
    @vasantskr 3 years ago

    Can I connect to Azure Synapse (dedicated pool) using Data Studio and run PySpark commands on the Synapse tables?

  • @JaspreetKaur-bp4ru
    @JaspreetKaur-bp4ru 2 years ago

    How can we pass multiple values to a single parameter through the pipeline? For example, I want to create 5 views and pass those 5 view names via the pipeline, so that the stored proc gets executed 5 times to create those 5 views. How can we do that?

    • @joachim3390
      @joachim3390 1 year ago

      You can, for example, use the ForEach loop and then add a list of parameters for each view.

    • @Wannabesage336
      @Wannabesage336 1 year ago

      He covers it at 25:34.

  • @josemadrid4343
    @josemadrid4343 3 years ago

    The video quality is really bad.

    • @AdvancingAnalytics
      @AdvancingAnalytics  3 years ago +3

      Sooo I shared the link before HD had finished processing. Give it 20 mins and you should have better options available!
      Simon

    • @allruiz
      @allruiz 3 years ago

      I was about to ask if it was only me getting 360p.

    • @chasedoe2594
      @chasedoe2594 3 years ago +1

      @@allruiz It usually means the vid is really fresh. HD processing hasn't finished yet.

    • @AdvancingAnalytics
      @AdvancingAnalytics  3 years ago +2

      2 hours later, HD still processing... we'll get there one day folks!

    • @crouch.g
      @crouch.g 3 years ago +1

      Assuming everything is a varchar(max) isn't OK. The query estimate will prevent the view from working on wide, long tables. The answer is to provide the correct schema in the OPENROWSET. How could you get the correct schema out of the parquet files?